Free Statistics

Author: *Unverified author*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sun, 25 Nov 2007 10:01:57 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/25/t11960096486d6h82gbic1ph2h.htm/, Retrieved Sat, 04 May 2024 18:15:28 +0000
Alternative citation: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=6501, Retrieved Sat, 04 May 2024 18:15:28 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords
Estimated Impact: 168
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Workshop 3 Q3 ass...] [2007-11-25 17:01:57] [44cf2be50bc8700e14714598feda9df9] [Current]
Dataseries X:
15761.3	0
16943.0	0
15070.3	0
13659.6	0
14768.9	0
14725.1	0
15998.1	0
15370.6	0
14956.9	0
15469.7	0
15101.8	0
11703.7	0
16283.6	0
16726.5	0
14968.9	0
14861.0	1
14583.3	1
15305.8	1
17903.9	1
16379.4	0
15420.3	0
17870.5	1
15912.8	1
13866.5	1
17823.2	1
17872.0	1
17420.4	1
16704.4	1
15991.2	1
16583.6	1
19123.5	1
17838.7	1
17209.4	1
18586.5	1
16258.1	1
15141.6	1
19202.1	1
17746.5	1
19090.1	0
18040.3	0
17515.5	1
17751.8	0
21072.4	0
17170.0	1
19439.5	1
19795.4	1
17574.9	1
16165.4	1
19464.6	1
19932.1	1
19961.2	1
17343.4	1
18924.2	1
18574.1	1
21350.6	1
18840.1	1
20304.8	1
21132.4	1
19753.9	1
18009.9	1
20390.4	1
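
The two columns above are the dependent series (y) and a 0/1 dummy regressor (x). To reproduce the computation locally, the series can be read into R along these lines; this is a minimal sketch, not part of the archived transaction, and the object name dataseries is illustrative (only the first three rows are shown — paste the full series in their place):

# Minimal sketch (not part of the archived transaction): read the
# tab-separated columns of Dataseries X into a data.frame.
dataseries <- read.table(text = "
15761.3	0
16943.0	0
15070.3	0
", col.names = c("y", "x"))
str(dataseries)   # numeric y and the 0/1 dummy x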




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=6501&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=6501&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=6501&T=0








Multiple Linear Regression - Estimated Regression Equation
y[t] = 15964.8714285714 + 1921.31357142857 x[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] =  +  15964.8714285714 +  1921.31357142857x[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=6501&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]y[t] =  +  15964.8714285714 +  1921.31357142857x[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=6501&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=6501&T=1
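
Because x is a 0/1 dummy, the fitted equation can be checked against the two group means: the intercept (about 15964.87) should equal the mean of y over the observations with x = 0, and the intercept plus the slope (about 17886.19) the mean over the observations with x = 1. A minimal sketch of that check, assuming the illustrative dataseries object built from the data above:

# Sketch: verify the reported coefficients from the group means
# (assumes the 'dataseries' data.frame from the data section).
with(dataseries, tapply(y, x, mean))   # roughly 15964.87 (x = 0) and 17886.19 (x = 1)
coef(lm(y ~ x, data = dataseries))     # intercept about 15964.87, slope about 1921.31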








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	15964.8714285714	407.612063	39.1668	0	0
x	1921.31357142857	503.363766	3.8169	0.000326	0.000163

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 15964.8714285714 & 407.612063 & 39.1668 & 0 & 0 \tabularnewline
x & 1921.31357142857 & 503.363766 & 3.8169 & 0.000326 & 0.000163 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=6501&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]15964.8714285714[/C][C]407.612063[/C][C]39.1668[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]x[/C][C]1921.31357142857[/C][C]503.363766[/C][C]3.8169[/C][C]0.000326[/C][C]0.000163[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=6501&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=6501&T=2
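
The T-STAT column is the parameter estimate divided by its S.D. (standard error), and the p-values come from the t distribution with 59 degrees of freedom (61 observations minus 2 estimated parameters). A hedged sketch of that arithmetic for the x coefficient, using only the numbers reported above:

# Sketch: reproduce the reported T-STAT and p-values for x from the table values.
estimate <- 1921.31357142857
std.dev  <- 503.363766
t.stat   <- estimate / std.dev              # about 3.8169
p.2tail  <- 2 * pt(-abs(t.stat), df = 59)   # about 0.000326
p.1tail  <- p.2tail / 2                     # about 0.000163
c(t.stat = t.stat, p.2tail = p.2tail, p.1tail = p.1tail)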








Multiple Linear Regression - Regression Statistics
Multiple R	0.44500879133961
R-squared	0.198032824369540
Adjusted R-squared	0.184440160375804
F-TEST (value)	14.5690958343995
F-TEST (DF numerator)	1
F-TEST (DF denominator)	59
p-value	0.000326356023198859
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	1867.91313463446
Sum Squared Residuals	205856869.233857

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.44500879133961 \tabularnewline
R-squared & 0.198032824369540 \tabularnewline
Adjusted R-squared & 0.184440160375804 \tabularnewline
F-TEST (value) & 14.5690958343995 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 59 \tabularnewline
p-value & 0.000326356023198859 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 1867.91313463446 \tabularnewline
Sum Squared Residuals & 205856869.233857 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=6501&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.44500879133961[/C][/ROW]
[ROW][C]R-squared[/C][C]0.198032824369540[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.184440160375804[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]14.5690958343995[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]1[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]59[/C][/ROW]
[ROW][C]p-value[/C][C]0.000326356023198859[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]1867.91313463446[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]205856869.233857[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=6501&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=6501&T=3
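
With a single regressor, the F test is the square of the slope's t statistic (3.8169 squared is about 14.569), and the reported p-value is the upper tail of the F distribution with 1 and 59 degrees of freedom, which is why it coincides with the 2-tail p-value of x. A minimal check using only the reported values:

# Sketch: relate the F test to the t statistic and recompute its p-value.
3.8169^2                          # about 14.569, the F-TEST (value)
1 - pf(14.5690958343995, 1, 59)   # about 0.000326, the reported p-value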








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	15761.3	15964.8714285714	-203.571428571383
2	16943	15964.8714285714	978.128571428568
3	15070.3	15964.8714285714	-894.571428571431
4	13659.6	15964.8714285714	-2305.27142857143
5	14768.9	15964.8714285714	-1195.97142857143
6	14725.1	15964.8714285714	-1239.77142857143
7	15998.1	15964.8714285714	33.2285714285698
8	15370.6	15964.8714285714	-594.27142857143
9	14956.9	15964.8714285714	-1007.97142857143
10	15469.7	15964.8714285714	-495.17142857143
11	15101.8	15964.8714285714	-863.071428571431
12	11703.7	15964.8714285714	-4261.17142857143
13	16283.6	15964.8714285714	318.72857142857
14	16726.5	15964.8714285714	761.62857142857
15	14968.9	15964.8714285714	-995.97142857143
16	14861	17886.185	-3025.185
17	14583.3	17886.185	-3302.885
18	15305.8	17886.185	-2580.385
19	17903.9	17886.185	17.7150000000012
20	16379.4	15964.8714285714	414.528571428569
21	15420.3	15964.8714285714	-544.571428571431
22	17870.5	17886.185	-15.6850000000003
23	15912.8	17886.185	-1973.385
24	13866.5	17886.185	-4019.685
25	17823.2	17886.185	-62.9849999999996
26	17872	17886.185	-14.1850000000003
27	17420.4	17886.185	-465.784999999999
28	16704.4	17886.185	-1181.785
29	15991.2	17886.185	-1894.985
30	16583.6	17886.185	-1302.58500000000
31	19123.5	17886.185	1237.315
32	17838.7	17886.185	-47.4849999999996
33	17209.4	17886.185	-676.784999999999
34	18586.5	17886.185	700.315
35	16258.1	17886.185	-1628.085
36	15141.6	17886.185	-2744.585
37	19202.1	17886.185	1315.915
38	17746.5	17886.185	-139.685000000000
39	19090.1	15964.8714285714	3125.22857142857
40	18040.3	15964.8714285714	2075.42857142857
41	17515.5	17886.185	-370.685
42	17751.8	15964.8714285714	1786.92857142857
43	21072.4	15964.8714285714	5107.52857142857
44	17170	17886.185	-716.185
45	19439.5	17886.185	1553.315
46	19795.4	17886.185	1909.215
47	17574.9	17886.185	-311.284999999999
48	16165.4	17886.185	-1720.785
49	19464.6	17886.185	1578.41500000000
50	19932.1	17886.185	2045.91500000000
51	19961.2	17886.185	2075.015
52	17343.4	17886.185	-542.784999999999
53	18924.2	17886.185	1038.015
54	18574.1	17886.185	687.914999999998
55	21350.6	17886.185	3464.415
56	18840.1	17886.185	953.914999999998
57	20304.8	17886.185	2418.615
58	21132.4	17886.185	3246.215
59	19753.9	17886.185	1867.715
60	18009.9	17886.185	123.715000000001
61	20390.4	17886.185	2504.215

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 15761.3 & 15964.8714285714 & -203.571428571383 \tabularnewline
2 & 16943 & 15964.8714285714 & 978.128571428568 \tabularnewline
3 & 15070.3 & 15964.8714285714 & -894.571428571431 \tabularnewline
4 & 13659.6 & 15964.8714285714 & -2305.27142857143 \tabularnewline
5 & 14768.9 & 15964.8714285714 & -1195.97142857143 \tabularnewline
6 & 14725.1 & 15964.8714285714 & -1239.77142857143 \tabularnewline
7 & 15998.1 & 15964.8714285714 & 33.2285714285698 \tabularnewline
8 & 15370.6 & 15964.8714285714 & -594.27142857143 \tabularnewline
9 & 14956.9 & 15964.8714285714 & -1007.97142857143 \tabularnewline
10 & 15469.7 & 15964.8714285714 & -495.17142857143 \tabularnewline
11 & 15101.8 & 15964.8714285714 & -863.071428571431 \tabularnewline
12 & 11703.7 & 15964.8714285714 & -4261.17142857143 \tabularnewline
13 & 16283.6 & 15964.8714285714 & 318.72857142857 \tabularnewline
14 & 16726.5 & 15964.8714285714 & 761.62857142857 \tabularnewline
15 & 14968.9 & 15964.8714285714 & -995.97142857143 \tabularnewline
16 & 14861 & 17886.185 & -3025.185 \tabularnewline
17 & 14583.3 & 17886.185 & -3302.885 \tabularnewline
18 & 15305.8 & 17886.185 & -2580.385 \tabularnewline
19 & 17903.9 & 17886.185 & 17.7150000000012 \tabularnewline
20 & 16379.4 & 15964.8714285714 & 414.528571428569 \tabularnewline
21 & 15420.3 & 15964.8714285714 & -544.571428571431 \tabularnewline
22 & 17870.5 & 17886.185 & -15.6850000000003 \tabularnewline
23 & 15912.8 & 17886.185 & -1973.385 \tabularnewline
24 & 13866.5 & 17886.185 & -4019.685 \tabularnewline
25 & 17823.2 & 17886.185 & -62.9849999999996 \tabularnewline
26 & 17872 & 17886.185 & -14.1850000000003 \tabularnewline
27 & 17420.4 & 17886.185 & -465.784999999999 \tabularnewline
28 & 16704.4 & 17886.185 & -1181.785 \tabularnewline
29 & 15991.2 & 17886.185 & -1894.985 \tabularnewline
30 & 16583.6 & 17886.185 & -1302.58500000000 \tabularnewline
31 & 19123.5 & 17886.185 & 1237.315 \tabularnewline
32 & 17838.7 & 17886.185 & -47.4849999999996 \tabularnewline
33 & 17209.4 & 17886.185 & -676.784999999999 \tabularnewline
34 & 18586.5 & 17886.185 & 700.315 \tabularnewline
35 & 16258.1 & 17886.185 & -1628.085 \tabularnewline
36 & 15141.6 & 17886.185 & -2744.585 \tabularnewline
37 & 19202.1 & 17886.185 & 1315.915 \tabularnewline
38 & 17746.5 & 17886.185 & -139.685000000000 \tabularnewline
39 & 19090.1 & 15964.8714285714 & 3125.22857142857 \tabularnewline
40 & 18040.3 & 15964.8714285714 & 2075.42857142857 \tabularnewline
41 & 17515.5 & 17886.185 & -370.685 \tabularnewline
42 & 17751.8 & 15964.8714285714 & 1786.92857142857 \tabularnewline
43 & 21072.4 & 15964.8714285714 & 5107.52857142857 \tabularnewline
44 & 17170 & 17886.185 & -716.185 \tabularnewline
45 & 19439.5 & 17886.185 & 1553.315 \tabularnewline
46 & 19795.4 & 17886.185 & 1909.215 \tabularnewline
47 & 17574.9 & 17886.185 & -311.284999999999 \tabularnewline
48 & 16165.4 & 17886.185 & -1720.785 \tabularnewline
49 & 19464.6 & 17886.185 & 1578.41500000000 \tabularnewline
50 & 19932.1 & 17886.185 & 2045.91500000000 \tabularnewline
51 & 19961.2 & 17886.185 & 2075.015 \tabularnewline
52 & 17343.4 & 17886.185 & -542.784999999999 \tabularnewline
53 & 18924.2 & 17886.185 & 1038.015 \tabularnewline
54 & 18574.1 & 17886.185 & 687.914999999998 \tabularnewline
55 & 21350.6 & 17886.185 & 3464.415 \tabularnewline
56 & 18840.1 & 17886.185 & 953.914999999998 \tabularnewline
57 & 20304.8 & 17886.185 & 2418.615 \tabularnewline
58 & 21132.4 & 17886.185 & 3246.215 \tabularnewline
59 & 19753.9 & 17886.185 & 1867.715 \tabularnewline
60 & 18009.9 & 17886.185 & 123.715000000001 \tabularnewline
61 & 20390.4 & 17886.185 & 2504.215 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=6501&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]15761.3[/C][C]15964.8714285714[/C][C]-203.571428571383[/C][/ROW]
[ROW][C]2[/C][C]16943[/C][C]15964.8714285714[/C][C]978.128571428568[/C][/ROW]
[ROW][C]3[/C][C]15070.3[/C][C]15964.8714285714[/C][C]-894.571428571431[/C][/ROW]
[ROW][C]4[/C][C]13659.6[/C][C]15964.8714285714[/C][C]-2305.27142857143[/C][/ROW]
[ROW][C]5[/C][C]14768.9[/C][C]15964.8714285714[/C][C]-1195.97142857143[/C][/ROW]
[ROW][C]6[/C][C]14725.1[/C][C]15964.8714285714[/C][C]-1239.77142857143[/C][/ROW]
[ROW][C]7[/C][C]15998.1[/C][C]15964.8714285714[/C][C]33.2285714285698[/C][/ROW]
[ROW][C]8[/C][C]15370.6[/C][C]15964.8714285714[/C][C]-594.27142857143[/C][/ROW]
[ROW][C]9[/C][C]14956.9[/C][C]15964.8714285714[/C][C]-1007.97142857143[/C][/ROW]
[ROW][C]10[/C][C]15469.7[/C][C]15964.8714285714[/C][C]-495.17142857143[/C][/ROW]
[ROW][C]11[/C][C]15101.8[/C][C]15964.8714285714[/C][C]-863.071428571431[/C][/ROW]
[ROW][C]12[/C][C]11703.7[/C][C]15964.8714285714[/C][C]-4261.17142857143[/C][/ROW]
[ROW][C]13[/C][C]16283.6[/C][C]15964.8714285714[/C][C]318.72857142857[/C][/ROW]
[ROW][C]14[/C][C]16726.5[/C][C]15964.8714285714[/C][C]761.62857142857[/C][/ROW]
[ROW][C]15[/C][C]14968.9[/C][C]15964.8714285714[/C][C]-995.97142857143[/C][/ROW]
[ROW][C]16[/C][C]14861[/C][C]17886.185[/C][C]-3025.185[/C][/ROW]
[ROW][C]17[/C][C]14583.3[/C][C]17886.185[/C][C]-3302.885[/C][/ROW]
[ROW][C]18[/C][C]15305.8[/C][C]17886.185[/C][C]-2580.385[/C][/ROW]
[ROW][C]19[/C][C]17903.9[/C][C]17886.185[/C][C]17.7150000000012[/C][/ROW]
[ROW][C]20[/C][C]16379.4[/C][C]15964.8714285714[/C][C]414.528571428569[/C][/ROW]
[ROW][C]21[/C][C]15420.3[/C][C]15964.8714285714[/C][C]-544.571428571431[/C][/ROW]
[ROW][C]22[/C][C]17870.5[/C][C]17886.185[/C][C]-15.6850000000003[/C][/ROW]
[ROW][C]23[/C][C]15912.8[/C][C]17886.185[/C][C]-1973.385[/C][/ROW]
[ROW][C]24[/C][C]13866.5[/C][C]17886.185[/C][C]-4019.685[/C][/ROW]
[ROW][C]25[/C][C]17823.2[/C][C]17886.185[/C][C]-62.9849999999996[/C][/ROW]
[ROW][C]26[/C][C]17872[/C][C]17886.185[/C][C]-14.1850000000003[/C][/ROW]
[ROW][C]27[/C][C]17420.4[/C][C]17886.185[/C][C]-465.784999999999[/C][/ROW]
[ROW][C]28[/C][C]16704.4[/C][C]17886.185[/C][C]-1181.785[/C][/ROW]
[ROW][C]29[/C][C]15991.2[/C][C]17886.185[/C][C]-1894.985[/C][/ROW]
[ROW][C]30[/C][C]16583.6[/C][C]17886.185[/C][C]-1302.58500000000[/C][/ROW]
[ROW][C]31[/C][C]19123.5[/C][C]17886.185[/C][C]1237.315[/C][/ROW]
[ROW][C]32[/C][C]17838.7[/C][C]17886.185[/C][C]-47.4849999999996[/C][/ROW]
[ROW][C]33[/C][C]17209.4[/C][C]17886.185[/C][C]-676.784999999999[/C][/ROW]
[ROW][C]34[/C][C]18586.5[/C][C]17886.185[/C][C]700.315[/C][/ROW]
[ROW][C]35[/C][C]16258.1[/C][C]17886.185[/C][C]-1628.085[/C][/ROW]
[ROW][C]36[/C][C]15141.6[/C][C]17886.185[/C][C]-2744.585[/C][/ROW]
[ROW][C]37[/C][C]19202.1[/C][C]17886.185[/C][C]1315.915[/C][/ROW]
[ROW][C]38[/C][C]17746.5[/C][C]17886.185[/C][C]-139.685000000000[/C][/ROW]
[ROW][C]39[/C][C]19090.1[/C][C]15964.8714285714[/C][C]3125.22857142857[/C][/ROW]
[ROW][C]40[/C][C]18040.3[/C][C]15964.8714285714[/C][C]2075.42857142857[/C][/ROW]
[ROW][C]41[/C][C]17515.5[/C][C]17886.185[/C][C]-370.685[/C][/ROW]
[ROW][C]42[/C][C]17751.8[/C][C]15964.8714285714[/C][C]1786.92857142857[/C][/ROW]
[ROW][C]43[/C][C]21072.4[/C][C]15964.8714285714[/C][C]5107.52857142857[/C][/ROW]
[ROW][C]44[/C][C]17170[/C][C]17886.185[/C][C]-716.185[/C][/ROW]
[ROW][C]45[/C][C]19439.5[/C][C]17886.185[/C][C]1553.315[/C][/ROW]
[ROW][C]46[/C][C]19795.4[/C][C]17886.185[/C][C]1909.215[/C][/ROW]
[ROW][C]47[/C][C]17574.9[/C][C]17886.185[/C][C]-311.284999999999[/C][/ROW]
[ROW][C]48[/C][C]16165.4[/C][C]17886.185[/C][C]-1720.785[/C][/ROW]
[ROW][C]49[/C][C]19464.6[/C][C]17886.185[/C][C]1578.41500000000[/C][/ROW]
[ROW][C]50[/C][C]19932.1[/C][C]17886.185[/C][C]2045.91500000000[/C][/ROW]
[ROW][C]51[/C][C]19961.2[/C][C]17886.185[/C][C]2075.015[/C][/ROW]
[ROW][C]52[/C][C]17343.4[/C][C]17886.185[/C][C]-542.784999999999[/C][/ROW]
[ROW][C]53[/C][C]18924.2[/C][C]17886.185[/C][C]1038.015[/C][/ROW]
[ROW][C]54[/C][C]18574.1[/C][C]17886.185[/C][C]687.914999999998[/C][/ROW]
[ROW][C]55[/C][C]21350.6[/C][C]17886.185[/C][C]3464.415[/C][/ROW]
[ROW][C]56[/C][C]18840.1[/C][C]17886.185[/C][C]953.914999999998[/C][/ROW]
[ROW][C]57[/C][C]20304.8[/C][C]17886.185[/C][C]2418.615[/C][/ROW]
[ROW][C]58[/C][C]21132.4[/C][C]17886.185[/C][C]3246.215[/C][/ROW]
[ROW][C]59[/C][C]19753.9[/C][C]17886.185[/C][C]1867.715[/C][/ROW]
[ROW][C]60[/C][C]18009.9[/C][C]17886.185[/C][C]123.715000000001[/C][/ROW]
[ROW][C]61[/C][C]20390.4[/C][C]17886.185[/C][C]2504.215[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=6501&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=6501&T=4
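
Since the only regressor is the dummy x, the interpolation (fitted) value in this table is one of just two numbers: 15964.87 for observations with x = 0 and 17886.19 for observations with x = 1, and the residual is the actual value minus that fitted value. A minimal sketch that rebuilds the table, again assuming the illustrative dataseries object:

# Sketch: rebuild the Actuals / Interpolation / Residuals table with lm().
fit <- lm(y ~ x, data = dataseries)
act.int.res <- data.frame(Actuals       = dataseries$y,
                          Interpolation = fitted(fit),
                          Residuals     = residuals(fit))
head(act.int.res)   # first rows should match the table above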




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
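
The module code above builds the design matrix (optionally adding seasonal dummies, first differences, or a linear trend), fits it with lm(), and formats the summary into the tables shown earlier. For the parameter settings of this run (par1 = 1, no seasonal dummies, no linear trend) the core computation reduces to a single ordinary least squares fit; a minimal sketch, assuming the illustrative dataseries object from the data section:

# Sketch: the core of this particular run reduces to one lm() call.
fit <- lm(y ~ x, data = dataseries)
summary(fit)            # coefficients, R-squared and F test as in the tables above
sum(residuals(fit)^2)   # Sum Squared Residuals, about 205856869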