FreeStatistics.org - Statistical Computations

Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 19 Nov 2007 10:17:30 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/19/t1195492762cvaclr5nal2rjic.htm/, Retrieved Fri, 03 May 2024 10:10:48 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5745, Retrieved Fri, 03 May 2024 10:10:48 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 178
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-  [Multiple Regression] [Question 3 part 1] [2007-11-19 17:17:30] [c40c597932a04e0e43159741c7e63e4c] [Current]
Feedback Forum

Dataseries X:
103,6500	0
103,8700	0
103,9400	0
105,3200	0
105,5400	0
106,0800	0
106,2100	0
105,5300	0
105,5600	0
105,1400	0
105,9700	0
105,4500	0
106,2200	0
106,3100	0
107,3800	0
109,3100	0
110,8200	0
111,2200	0
110,6600	0
110,7600	0
110,6900	0
111,0800	0
110,9700	0
110,2400	0
112,5100	1
111,5200	1
112,1300	1
112,2300	1
112,9200	1
111,8900	1
111,9900	1
111,5100	1
112,3300	1
112,0400	1
112,0900	1
111,4100	1
112,6100	1
113,1400	1
113,6500	1
114,2600	1
114,4000	1
114,9300	1
114,8600	1
114,9500	1
116,1700	1
114,6000	1
114,6200	1
113,8200	1
115,0200	1
115,1800	1
115,5900	1
116,6000	1
117,0700	1
116,9600	1
116,6600	1
116,0700	1
116,0400	1
115,8100	1
116,2200	1
115,8500	1
116,4300	1
117,3900	1
119,1700	1
119,2400	1
120,0300	1
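
The series above uses a decimal comma and a tab-separated 0/1 dummy in the second column. A minimal sketch of how it could be loaded and refit locally in R (the file name and column names are illustrative, not part of the original computation):

# Hypothetical local reproduction: save the two columns above to 'dataseries.txt'
# (decimal commas, tab-separated) and let read.table() convert the commas via dec=",".
df <- read.table("dataseries.txt", header = FALSE, sep = "\t", dec = ",",
                 col.names = c("y", "x"))
str(df)   # 65 observations: numeric series y, 0/1 dummy x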




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5745&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5745&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5745&T=0








Multiple Linear Regression - Estimated Regression Equation
y[t] = + 107.413333333333 + 7.26739837398373x[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] =  +  107.413333333333 +  7.26739837398373x[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5745&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]y[t] =  +  107.413333333333 +  7.26739837398373x[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5745&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5745&T=1
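
Because x is a 0/1 dummy, the intercept should equal the mean of the first 24 observations (x = 0) and the slope the difference between the two group means. A small check, continuing the illustrative data frame df sketched above:

fit <- lm(y ~ x, data = df)
coef(fit)                                      # intercept ~ 107.413, slope ~ 7.267
mean(df$y[df$x == 0])                          # equals the intercept
mean(df$y[df$x == 1]) - mean(df$y[df$x == 0])  # equals the slope on the dummy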








Multiple Linear Regression - Ordinary Least Squares
Variable    | Parameter        | S.D.     | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | 107.413333333333 | 0.495847 | 216.626                    | 0              | 0
x           | 7.26739837398373 | 0.624327 | 11.6404                    | 0              | 0

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 107.413333333333 & 0.495847 & 216.626 & 0 & 0 \tabularnewline
x & 7.26739837398373 & 0.624327 & 11.6404 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5745&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]107.413333333333[/C][C]0.495847[/C][C]216.626[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]x[/C][C]7.26739837398373[/C][C]0.624327[/C][C]11.6404[/C][C]0[/C][C]0[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5745&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5745&T=2
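
The T-STAT column is the parameter estimate divided by its standard deviation (standard error), and the tail probabilities come from a t distribution with 65 - 2 = 63 degrees of freedom. A hedged check using the reported numbers:

t_x <- 7.26739837398373 / 0.624327   # ~ 11.64, matches the T-STAT for x
2 * pt(-abs(t_x), df = 63)           # 2-tail p-value (rounds to 0 at the precision shown)
pt(-abs(t_x), df = 63)               # 1-tail p-value, i.e. half the 2-tail value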








Multiple Linear Regression - Regression Statistics
Multiple R: 0.826206154961996
R-squared: 0.682616610497086
Adjusted R-squared: 0.677578778917675
F-TEST (value): 135.498100668314
F-TEST (DF numerator): 1
F-TEST (DF denominator): 63
p-value: 0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 2.42914385395515
Sum Squared Residuals: 371.74661138211

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.826206154961996 \tabularnewline
R-squared & 0.682616610497086 \tabularnewline
Adjusted R-squared & 0.677578778917675 \tabularnewline
F-TEST (value) & 135.498100668314 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 63 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 2.42914385395515 \tabularnewline
Sum Squared Residuals & 371.74661138211 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5745&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.826206154961996[/C][/ROW]
[ROW][C]R-squared[/C][C]0.682616610497086[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.677578778917675[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]135.498100668314[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]1[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]63[/C][/ROW]
[ROW][C]p-value[/C][C]0[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]2.42914385395515[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]371.74661138211[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5745&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5745&T=3
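
For a single regressor these statistics are tied together by simple identities; a sketch verifying them from the reported values (n = 65 observations, k = 2 estimated parameters):

n <- 65; k <- 2
R2 <- 0.682616610497086
sqrt(R2)                           # Multiple R ~ 0.8262
1 - (1 - R2) * (n - 1) / (n - k)   # Adjusted R-squared ~ 0.6776
11.6404^2                          # F-statistic ~ 135.5 (the squared t-statistic of the slope)
sqrt(371.74661138211 / (n - k))    # Residual standard deviation ~ 2.429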








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index | Actuals | Interpolation (Forecast) | Residuals (Prediction Error)
1  | 103.65 | 107.413333333333 | -3.76333333333286
2  | 103.87 | 107.413333333333 | -3.54333333333337
3  | 103.94 | 107.413333333333 | -3.47333333333335
4  | 105.32 | 107.413333333333 | -2.09333333333336
5  | 105.54 | 107.413333333333 | -1.87333333333335
6  | 106.08 | 107.413333333333 | -1.33333333333335
7  | 106.21 | 107.413333333333 | -1.20333333333336
8  | 105.53 | 107.413333333333 | -1.88333333333335
9  | 105.56 | 107.413333333333 | -1.85333333333335
10 | 105.14 | 107.413333333333 | -2.27333333333335
11 | 105.97 | 107.413333333333 | -1.44333333333335
12 | 105.45 | 107.413333333333 | -1.96333333333335
13 | 106.22 | 107.413333333333 | -1.19333333333335
14 | 106.31 | 107.413333333333 | -1.10333333333335
15 | 107.38 | 107.413333333333 | -0.0333333333333569
16 | 109.31 | 107.413333333333 | 1.89666666666665
17 | 110.82 | 107.413333333333 | 3.40666666666664
18 | 111.22 | 107.413333333333 | 3.80666666666665
19 | 110.66 | 107.413333333333 | 3.24666666666664
20 | 110.76 | 107.413333333333 | 3.34666666666665
21 | 110.69 | 107.413333333333 | 3.27666666666665
22 | 111.08 | 107.413333333333 | 3.66666666666665
23 | 110.97 | 107.413333333333 | 3.55666666666665
24 | 110.24 | 107.413333333333 | 2.82666666666664
25 | 112.51 | 114.680731707317 | -2.17073170731707
26 | 111.52 | 114.680731707317 | -3.16073170731708
27 | 112.13 | 114.680731707317 | -2.55073170731708
28 | 112.23 | 114.680731707317 | -2.45073170731707
29 | 112.92 | 114.680731707317 | -1.76073170731707
30 | 111.89 | 114.680731707317 | -2.79073170731707
31 | 111.99 | 114.680731707317 | -2.69073170731708
32 | 111.51 | 114.680731707317 | -3.17073170731707
33 | 112.33 | 114.680731707317 | -2.35073170731707
34 | 112.04 | 114.680731707317 | -2.64073170731707
35 | 112.09 | 114.680731707317 | -2.59073170731707
36 | 111.41 | 114.680731707317 | -3.27073170731708
37 | 112.61 | 114.680731707317 | -2.07073170731707
38 | 113.14 | 114.680731707317 | -1.54073170731707
39 | 113.65 | 114.680731707317 | -1.03073170731707
40 | 114.26 | 114.680731707317 | -0.420731707317068
41 | 114.4  | 114.680731707317 | -0.280731707317067
42 | 114.93 | 114.680731707317 | 0.249268292682934
43 | 114.86 | 114.680731707317 | 0.179268292682926
44 | 114.95 | 114.680731707317 | 0.269268292682930
45 | 116.17 | 114.680731707317 | 1.48926829268293
46 | 114.6  | 114.680731707317 | -0.0807317073170788
47 | 114.62 | 114.680731707317 | -0.0607317073170686
48 | 113.82 | 114.680731707317 | -0.86073170731708
49 | 115.02 | 114.680731707317 | 0.339268292682923
50 | 115.18 | 114.680731707317 | 0.499268292682934
51 | 115.59 | 114.680731707317 | 0.90926829268293
52 | 116.6  | 114.680731707317 | 1.91926829268292
53 | 117.07 | 114.680731707317 | 2.38926829268292
54 | 116.96 | 114.680731707317 | 2.27926829268292
55 | 116.66 | 114.680731707317 | 1.97926829268292
56 | 116.07 | 114.680731707317 | 1.38926829268292
57 | 116.04 | 114.680731707317 | 1.35926829268293
58 | 115.81 | 114.680731707317 | 1.12926829268293
59 | 116.22 | 114.680731707317 | 1.53926829268293
60 | 115.85 | 114.680731707317 | 1.16926829268292
61 | 116.43 | 114.680731707317 | 1.74926829268293
62 | 117.39 | 114.680731707317 | 2.70926829268293
63 | 119.17 | 114.680731707317 | 4.48926829268293
64 | 119.24 | 114.680731707317 | 4.55926829268292
65 | 120.03 | 114.680731707317 | 5.34926829268293

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 103.65 & 107.413333333333 & -3.76333333333286 \tabularnewline
2 & 103.87 & 107.413333333333 & -3.54333333333337 \tabularnewline
3 & 103.94 & 107.413333333333 & -3.47333333333335 \tabularnewline
4 & 105.32 & 107.413333333333 & -2.09333333333336 \tabularnewline
5 & 105.54 & 107.413333333333 & -1.87333333333335 \tabularnewline
6 & 106.08 & 107.413333333333 & -1.33333333333335 \tabularnewline
7 & 106.21 & 107.413333333333 & -1.20333333333336 \tabularnewline
8 & 105.53 & 107.413333333333 & -1.88333333333335 \tabularnewline
9 & 105.56 & 107.413333333333 & -1.85333333333335 \tabularnewline
10 & 105.14 & 107.413333333333 & -2.27333333333335 \tabularnewline
11 & 105.97 & 107.413333333333 & -1.44333333333335 \tabularnewline
12 & 105.45 & 107.413333333333 & -1.96333333333335 \tabularnewline
13 & 106.22 & 107.413333333333 & -1.19333333333335 \tabularnewline
14 & 106.31 & 107.413333333333 & -1.10333333333335 \tabularnewline
15 & 107.38 & 107.413333333333 & -0.0333333333333569 \tabularnewline
16 & 109.31 & 107.413333333333 & 1.89666666666665 \tabularnewline
17 & 110.82 & 107.413333333333 & 3.40666666666664 \tabularnewline
18 & 111.22 & 107.413333333333 & 3.80666666666665 \tabularnewline
19 & 110.66 & 107.413333333333 & 3.24666666666664 \tabularnewline
20 & 110.76 & 107.413333333333 & 3.34666666666665 \tabularnewline
21 & 110.69 & 107.413333333333 & 3.27666666666665 \tabularnewline
22 & 111.08 & 107.413333333333 & 3.66666666666665 \tabularnewline
23 & 110.97 & 107.413333333333 & 3.55666666666665 \tabularnewline
24 & 110.24 & 107.413333333333 & 2.82666666666664 \tabularnewline
25 & 112.51 & 114.680731707317 & -2.17073170731707 \tabularnewline
26 & 111.52 & 114.680731707317 & -3.16073170731708 \tabularnewline
27 & 112.13 & 114.680731707317 & -2.55073170731708 \tabularnewline
28 & 112.23 & 114.680731707317 & -2.45073170731707 \tabularnewline
29 & 112.92 & 114.680731707317 & -1.76073170731707 \tabularnewline
30 & 111.89 & 114.680731707317 & -2.79073170731707 \tabularnewline
31 & 111.99 & 114.680731707317 & -2.69073170731708 \tabularnewline
32 & 111.51 & 114.680731707317 & -3.17073170731707 \tabularnewline
33 & 112.33 & 114.680731707317 & -2.35073170731707 \tabularnewline
34 & 112.04 & 114.680731707317 & -2.64073170731707 \tabularnewline
35 & 112.09 & 114.680731707317 & -2.59073170731707 \tabularnewline
36 & 111.41 & 114.680731707317 & -3.27073170731708 \tabularnewline
37 & 112.61 & 114.680731707317 & -2.07073170731707 \tabularnewline
38 & 113.14 & 114.680731707317 & -1.54073170731707 \tabularnewline
39 & 113.65 & 114.680731707317 & -1.03073170731707 \tabularnewline
40 & 114.26 & 114.680731707317 & -0.420731707317068 \tabularnewline
41 & 114.4 & 114.680731707317 & -0.280731707317067 \tabularnewline
42 & 114.93 & 114.680731707317 & 0.249268292682934 \tabularnewline
43 & 114.86 & 114.680731707317 & 0.179268292682926 \tabularnewline
44 & 114.95 & 114.680731707317 & 0.269268292682930 \tabularnewline
45 & 116.17 & 114.680731707317 & 1.48926829268293 \tabularnewline
46 & 114.6 & 114.680731707317 & -0.0807317073170788 \tabularnewline
47 & 114.62 & 114.680731707317 & -0.0607317073170686 \tabularnewline
48 & 113.82 & 114.680731707317 & -0.86073170731708 \tabularnewline
49 & 115.02 & 114.680731707317 & 0.339268292682923 \tabularnewline
50 & 115.18 & 114.680731707317 & 0.499268292682934 \tabularnewline
51 & 115.59 & 114.680731707317 & 0.90926829268293 \tabularnewline
52 & 116.6 & 114.680731707317 & 1.91926829268292 \tabularnewline
53 & 117.07 & 114.680731707317 & 2.38926829268292 \tabularnewline
54 & 116.96 & 114.680731707317 & 2.27926829268292 \tabularnewline
55 & 116.66 & 114.680731707317 & 1.97926829268292 \tabularnewline
56 & 116.07 & 114.680731707317 & 1.38926829268292 \tabularnewline
57 & 116.04 & 114.680731707317 & 1.35926829268293 \tabularnewline
58 & 115.81 & 114.680731707317 & 1.12926829268293 \tabularnewline
59 & 116.22 & 114.680731707317 & 1.53926829268293 \tabularnewline
60 & 115.85 & 114.680731707317 & 1.16926829268292 \tabularnewline
61 & 116.43 & 114.680731707317 & 1.74926829268293 \tabularnewline
62 & 117.39 & 114.680731707317 & 2.70926829268293 \tabularnewline
63 & 119.17 & 114.680731707317 & 4.48926829268293 \tabularnewline
64 & 119.24 & 114.680731707317 & 4.55926829268292 \tabularnewline
65 & 120.03 & 114.680731707317 & 5.34926829268293 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5745&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]103.65[/C][C]107.413333333333[/C][C]-3.76333333333286[/C][/ROW]
[ROW][C]2[/C][C]103.87[/C][C]107.413333333333[/C][C]-3.54333333333337[/C][/ROW]
[ROW][C]3[/C][C]103.94[/C][C]107.413333333333[/C][C]-3.47333333333335[/C][/ROW]
[ROW][C]4[/C][C]105.32[/C][C]107.413333333333[/C][C]-2.09333333333336[/C][/ROW]
[ROW][C]5[/C][C]105.54[/C][C]107.413333333333[/C][C]-1.87333333333335[/C][/ROW]
[ROW][C]6[/C][C]106.08[/C][C]107.413333333333[/C][C]-1.33333333333335[/C][/ROW]
[ROW][C]7[/C][C]106.21[/C][C]107.413333333333[/C][C]-1.20333333333336[/C][/ROW]
[ROW][C]8[/C][C]105.53[/C][C]107.413333333333[/C][C]-1.88333333333335[/C][/ROW]
[ROW][C]9[/C][C]105.56[/C][C]107.413333333333[/C][C]-1.85333333333335[/C][/ROW]
[ROW][C]10[/C][C]105.14[/C][C]107.413333333333[/C][C]-2.27333333333335[/C][/ROW]
[ROW][C]11[/C][C]105.97[/C][C]107.413333333333[/C][C]-1.44333333333335[/C][/ROW]
[ROW][C]12[/C][C]105.45[/C][C]107.413333333333[/C][C]-1.96333333333335[/C][/ROW]
[ROW][C]13[/C][C]106.22[/C][C]107.413333333333[/C][C]-1.19333333333335[/C][/ROW]
[ROW][C]14[/C][C]106.31[/C][C]107.413333333333[/C][C]-1.10333333333335[/C][/ROW]
[ROW][C]15[/C][C]107.38[/C][C]107.413333333333[/C][C]-0.0333333333333569[/C][/ROW]
[ROW][C]16[/C][C]109.31[/C][C]107.413333333333[/C][C]1.89666666666665[/C][/ROW]
[ROW][C]17[/C][C]110.82[/C][C]107.413333333333[/C][C]3.40666666666664[/C][/ROW]
[ROW][C]18[/C][C]111.22[/C][C]107.413333333333[/C][C]3.80666666666665[/C][/ROW]
[ROW][C]19[/C][C]110.66[/C][C]107.413333333333[/C][C]3.24666666666664[/C][/ROW]
[ROW][C]20[/C][C]110.76[/C][C]107.413333333333[/C][C]3.34666666666665[/C][/ROW]
[ROW][C]21[/C][C]110.69[/C][C]107.413333333333[/C][C]3.27666666666665[/C][/ROW]
[ROW][C]22[/C][C]111.08[/C][C]107.413333333333[/C][C]3.66666666666665[/C][/ROW]
[ROW][C]23[/C][C]110.97[/C][C]107.413333333333[/C][C]3.55666666666665[/C][/ROW]
[ROW][C]24[/C][C]110.24[/C][C]107.413333333333[/C][C]2.82666666666664[/C][/ROW]
[ROW][C]25[/C][C]112.51[/C][C]114.680731707317[/C][C]-2.17073170731707[/C][/ROW]
[ROW][C]26[/C][C]111.52[/C][C]114.680731707317[/C][C]-3.16073170731708[/C][/ROW]
[ROW][C]27[/C][C]112.13[/C][C]114.680731707317[/C][C]-2.55073170731708[/C][/ROW]
[ROW][C]28[/C][C]112.23[/C][C]114.680731707317[/C][C]-2.45073170731707[/C][/ROW]
[ROW][C]29[/C][C]112.92[/C][C]114.680731707317[/C][C]-1.76073170731707[/C][/ROW]
[ROW][C]30[/C][C]111.89[/C][C]114.680731707317[/C][C]-2.79073170731707[/C][/ROW]
[ROW][C]31[/C][C]111.99[/C][C]114.680731707317[/C][C]-2.69073170731708[/C][/ROW]
[ROW][C]32[/C][C]111.51[/C][C]114.680731707317[/C][C]-3.17073170731707[/C][/ROW]
[ROW][C]33[/C][C]112.33[/C][C]114.680731707317[/C][C]-2.35073170731707[/C][/ROW]
[ROW][C]34[/C][C]112.04[/C][C]114.680731707317[/C][C]-2.64073170731707[/C][/ROW]
[ROW][C]35[/C][C]112.09[/C][C]114.680731707317[/C][C]-2.59073170731707[/C][/ROW]
[ROW][C]36[/C][C]111.41[/C][C]114.680731707317[/C][C]-3.27073170731708[/C][/ROW]
[ROW][C]37[/C][C]112.61[/C][C]114.680731707317[/C][C]-2.07073170731707[/C][/ROW]
[ROW][C]38[/C][C]113.14[/C][C]114.680731707317[/C][C]-1.54073170731707[/C][/ROW]
[ROW][C]39[/C][C]113.65[/C][C]114.680731707317[/C][C]-1.03073170731707[/C][/ROW]
[ROW][C]40[/C][C]114.26[/C][C]114.680731707317[/C][C]-0.420731707317068[/C][/ROW]
[ROW][C]41[/C][C]114.4[/C][C]114.680731707317[/C][C]-0.280731707317067[/C][/ROW]
[ROW][C]42[/C][C]114.93[/C][C]114.680731707317[/C][C]0.249268292682934[/C][/ROW]
[ROW][C]43[/C][C]114.86[/C][C]114.680731707317[/C][C]0.179268292682926[/C][/ROW]
[ROW][C]44[/C][C]114.95[/C][C]114.680731707317[/C][C]0.269268292682930[/C][/ROW]
[ROW][C]45[/C][C]116.17[/C][C]114.680731707317[/C][C]1.48926829268293[/C][/ROW]
[ROW][C]46[/C][C]114.6[/C][C]114.680731707317[/C][C]-0.0807317073170788[/C][/ROW]
[ROW][C]47[/C][C]114.62[/C][C]114.680731707317[/C][C]-0.0607317073170686[/C][/ROW]
[ROW][C]48[/C][C]113.82[/C][C]114.680731707317[/C][C]-0.86073170731708[/C][/ROW]
[ROW][C]49[/C][C]115.02[/C][C]114.680731707317[/C][C]0.339268292682923[/C][/ROW]
[ROW][C]50[/C][C]115.18[/C][C]114.680731707317[/C][C]0.499268292682934[/C][/ROW]
[ROW][C]51[/C][C]115.59[/C][C]114.680731707317[/C][C]0.90926829268293[/C][/ROW]
[ROW][C]52[/C][C]116.6[/C][C]114.680731707317[/C][C]1.91926829268292[/C][/ROW]
[ROW][C]53[/C][C]117.07[/C][C]114.680731707317[/C][C]2.38926829268292[/C][/ROW]
[ROW][C]54[/C][C]116.96[/C][C]114.680731707317[/C][C]2.27926829268292[/C][/ROW]
[ROW][C]55[/C][C]116.66[/C][C]114.680731707317[/C][C]1.97926829268292[/C][/ROW]
[ROW][C]56[/C][C]116.07[/C][C]114.680731707317[/C][C]1.38926829268292[/C][/ROW]
[ROW][C]57[/C][C]116.04[/C][C]114.680731707317[/C][C]1.35926829268293[/C][/ROW]
[ROW][C]58[/C][C]115.81[/C][C]114.680731707317[/C][C]1.12926829268293[/C][/ROW]
[ROW][C]59[/C][C]116.22[/C][C]114.680731707317[/C][C]1.53926829268293[/C][/ROW]
[ROW][C]60[/C][C]115.85[/C][C]114.680731707317[/C][C]1.16926829268292[/C][/ROW]
[ROW][C]61[/C][C]116.43[/C][C]114.680731707317[/C][C]1.74926829268293[/C][/ROW]
[ROW][C]62[/C][C]117.39[/C][C]114.680731707317[/C][C]2.70926829268293[/C][/ROW]
[ROW][C]63[/C][C]119.17[/C][C]114.680731707317[/C][C]4.48926829268293[/C][/ROW]
[ROW][C]64[/C][C]119.24[/C][C]114.680731707317[/C][C]4.55926829268292[/C][/ROW]
[ROW][C]65[/C][C]120.03[/C][C]114.680731707317[/C][C]5.34926829268293[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5745&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5745&T=4
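
The Interpolation (Forecast) column holds the fitted values and the Residuals (Prediction Error) column the differences between actuals and fitted values; with a single dummy regressor the fitted value is simply the mean of the relevant group. Continuing the illustrative fit sketched earlier, the same columns can be extracted with:

interp <- fitted(fit)   # 107.413... for the first 24 rows, 114.681... afterwards
res    <- resid(fit)    # actuals minus fitted values
head(cbind(actual = df$y, interp, res))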




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
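# 'y' is the data matrix supplied by the module; transpose it so rows are observations
# and move column par1 (the dependent variable) to the front of the design matrix.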
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
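# Optional transformations driven by the user parameters: first differences (par3),
# monthly or quarterly seasonal dummies (par2), and a linear trend column (par3).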
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
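# Diagnostic plots: actuals with interpolation, residuals, histogram, density plot,
# normal Q-Q plot, lag plot with lowess and regression line, ACF/PACF, and the
# standard lm() diagnostics panel.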
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
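# Load the (server-side) table-building helpers and write out the four result tables:
# regression equation, OLS coefficients, fit statistics, and actuals/interpolation/residuals.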
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
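# Per-observation table: actual value, fitted value (interpolation), and residual.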
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')