Free Statistics

Author: *Unverified author*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 22 Nov 2007 12:11:37 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/22/t11957582703qrvkhufm9c0w8x.htm/, Retrieved Fri, 03 May 2024 01:04:19 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=6092, Retrieved Fri, 03 May 2024 01:04:19 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords: Tinne Van der Eycken Workshop 2
Estimated Impact: 166
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Workshop: Seatbel...] [2007-11-22 19:11:37] [c8635c97647ba59406cb570a9fab7b02] [Current]
Dataseries X:
103.4	0
101.87	0
101.11	0
98.47	0
97.8	0
97.37	0
97.29	0
93.06	0
92.39	0
93.73	0
94.81	0
93.24	0
90.09	0
89.86	0
87.92	0
86.3	0
86.5	0
87.93	0
88.6	0
90.08	0
88.84	0
87.91	0
88.31	0
87.77	0
86.11	0
82.8	0
81.65	0
82.36	0
82.91	0
81.99	0
83.32	0
84.12	0
85.66	0
86.67	0
85.31	0
85.13	0
86.6	0
87.92	0
87.19	0
85.56	0
86.21	0
86.16	0
85.04	0
82.01	0
83.05	0
83.34	0
82.87	0
83.18	0
83.97	0
82.98	1
82.2	1
83.68	1
83.49	1
82.68	1
81.56	1
81.19	1
81.07	1
79.81	1
79.72	1
78.32	1
76.6	1




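The block above lists 61 observations: the first column is the dependent index series (Y in the regression output below) and the second a 0/1 dummy (X) that switches from 0 to 1 at observation 50. A minimal, hypothetical sketch of how this block can be turned into the y, par1, par2 and par3 objects expected by the R code at the bottom of this page (only a few rows are spelled out; read.table skips the '#' placeholder line, so paste in all 61 rows in practice):

# Hypothetical sketch, not part of the archived output.
series <- read.table(text = "
103.4 0
101.87 0
101.11 0
# ... remaining rows of the Dataseries X block ...
78.32 1
76.6 1
", col.names = c("Y", "X"))
y <- t(as.matrix(series))   # one row per variable; the module transposes it back with x <- t(y)
par1 <- '1'                 # dependent variable = first row of y
par2 <- 'Include Monthly Dummies'
par3 <- 'Linear Trend'
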
Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135


Source: https://freestatistics.org/blog/index.php?pk=6092&T=0



Multiple Linear Regression - Estimated Regression Equation
Y[t] = + 97.0079108280255 + 2.84945063694268X[t] + 0.688398221868353M1[t] + 0.210833067940549M2[t] -0.526450238853509M3[t] -0.93173354564756M4[t] -0.48901685244162M5[t] -0.31030015923567M6[t] -0.0395834660297264M7[t] -0.774866772823781M8[t] -0.33015007961784M9[t] + 0.094566613588108M10[t] + 0.341283306794053M11[t] -0.334716693205945t + e[t]


Source: https://freestatistics.org/blog/index.php?pk=6092&T=1


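For reference, the fitted equation combines the 0/1 dummy X, eleven monthly dummies M1 to M11 (the twelfth month of each yearly cycle is the reference category), and a linear trend t. The archived module builds this design matrix programmatically (see the R code below) and fits it with lm(df); a hedged, explicit-formula sketch of the same fit, assuming value holds the 61 observations of the index and X the 0/1 dummy:

# Sketch only; 'value' and 'X' are assumed names, not the archived objects.
n     <- length(value)
month <- relevel(factor(((seq_len(n) - 1) %% 12) + 1), ref = "12")   # month 12 = reference
trend <- seq_len(n)
fit   <- lm(value ~ X + month + trend)
coef(fit)   # should reproduce the intercept, X, M1..M11 and t coefficients above
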

Multiple Linear Regression - Ordinary Least Squares
Variable       Parameter              S.D.       T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)    97.0079108280255       1.751451   55.3872                      0                0
X              2.84945063694268       1.442362   1.9755                       0.054095         0.027047
M1             0.688398221868353      1.943126   0.3543                       0.724718         0.362359
M2             0.210833067940549      2.051333   0.1028                       0.918576         0.459288
M3             -0.526450238853509     2.046272   -0.2573                      0.798091         0.399046
M4             -0.93173354564756      2.041733   -0.4563                      0.650244         0.325122
M5             -0.48901685244162      2.037719   -0.24                        0.811387         0.405694
M6             -0.31030015923567      2.034234   -0.1525                      0.879415         0.439707
M7             -0.0395834660297264    2.03128    -0.0195                      0.984535         0.492268
M8             -0.774866772823781     2.028861   -0.3819                      0.704241         0.35212
M9             -0.33015007961784      2.026977   -0.1629                      0.871313         0.435656
M10            0.094566613588108      2.02563    0.0467                       0.962962         0.481481
M11            0.341283306794053      2.024822   0.1685                       0.866874         0.433437
t              -0.334716693205945     0.033039   -10.131                      0                0


Source: https://freestatistics.org/blog/index.php?pk=6092&T=2

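In this table the t-statistic is the parameter estimate divided by its standard deviation (the S.D. column), the 2-tail p-value is evaluated against a Student-t distribution with 47 residual degrees of freedom (61 observations minus 14 estimated parameters; see the F-TEST denominator below), and the 1-tail p-value is simply half of the 2-tail value. A small worked check for the X row, with the values copied from the table:

# Worked check for the X coefficient (values taken from the table above).
est <- 2.84945063694268
se  <- 1.442362                      # the "S.D." column
dof <- 47                            # residual degrees of freedom
tstat <- est / se                    # about 1.9755
p2    <- 2 * pt(-abs(tstat), dof)    # 2-tail p-value, about 0.054095
p1    <- p2 / 2                      # 1-tail p-value, about 0.027047
c(tstat = tstat, p2 = p2, p1 = p1)
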







Multiple Linear Regression - Regression Statistics
Multiple R                     0.878609736788236
R-squared                      0.771955069579093
Adjusted R-squared             0.70887881222863
F-TEST (value)                 12.2384412456491
F-TEST (DF numerator)          13
F-TEST (DF denominator)        47
p-value                        5.14712716892518e-11

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation    3.20109786680331
Sum Squared Residuals          481.610294984076


Source: https://freestatistics.org/blog/index.php?pk=6092&T=3

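These summary statistics are linked in the usual OLS way: Multiple R is the square root of R-squared, the adjusted R-squared corrects for the 13 regressors (plus intercept) in 61 observations, the F statistic has 13 and 47 degrees of freedom, and the residual standard deviation is the square root of the sum of squared residuals divided by 47. A small consistency check with the values copied from the table:

# Consistency check of the regression statistics above.
r2 <- 0.771955069579093
n  <- 61; p <- 13                            # observations; regressors excluding the intercept
sqrt(r2)                                     # Multiple R, about 0.8786
1 - (1 - r2) * (n - 1) / (n - p - 1)         # Adjusted R-squared, about 0.7089
fval <- (r2 / p) / ((1 - r2) / (n - p - 1))  # F-TEST (value), about 12.238
pf(fval, p, n - p - 1, lower.tail = FALSE)   # p-value, about 5.15e-11
sqrt(481.610294984076 / (n - p - 1))         # Residual Standard Deviation, about 3.2011
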







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1               103.4     97.361592356688            6.03840764331207
2               101.87    96.5493105095541           5.32068949044586
3               101.11    95.4773105095541           5.63268949044586
4               98.47     94.7373105095541           3.73268949044586
5               97.8      94.8453105095541           2.95468949044586
6               97.37     94.6893105095541           2.68068949044587
7               97.29     94.6253105095541           2.66468949044587
8               93.06     93.5553105095541           -0.495310509554137
9               92.39     93.6653105095541           -1.27531050955414
10              93.73     93.7553105095541           -0.0253105095541347
11              94.81     93.6673105095541           1.14268949044586
12              93.24     92.9913105095541           0.248689490445852
13              90.09     93.3449920382165           -3.25499203821655
14              89.86     92.5327101910828           -2.67271019108281
15              87.92     91.4607101910828           -3.5407101910828
16              86.3      90.7207101910828           -4.42071019108280
17              86.5      90.8287101910828           -4.32871019108279
18              87.93     90.6727101910828           -2.7427101910828
19              88.6      90.6087101910828           -2.00871019108281
20              90.08     89.5387101910828           0.541289808917195
21              88.84     89.6487101910828           -0.808710191082797
22              87.91     89.7387101910828           -1.82871019108281
23              88.31     89.6507101910828           -1.34071019108280
24              87.77     88.9747101910828           -1.20471019108281
25              86.11     89.3283917197452           -3.21839171974521
26              82.8      88.5161098726115           -5.71610987261147
27              81.65     87.4441098726115           -5.79410987261146
28              82.36     86.7041098726115           -4.34410987261147
29              82.91     86.8121098726115           -3.90210987261147
30              81.99     86.6561098726115           -4.66610987261147
31              83.32     86.5921098726115           -3.27210987261147
32              84.12     85.5221098726115           -1.40210987261146
33              85.66     85.6321098726115           0.0278901273885345
34              86.67     85.7221098726115           0.947890127388538
35              85.31     85.6341098726115           -0.324109872611465
36              85.13     84.9581098726115           0.171890127388528
37              86.6      85.311791401274            1.28820859872612
38              87.92     84.4995095541401           3.42049044585987
39              87.19     83.4275095541401           3.76249044585987
40              85.56     82.6875095541401           2.87249044585987
41              86.21     82.7955095541401           3.41449044585987
42              86.16     82.6395095541401           3.52049044585987
43              85.04     82.5755095541401           2.46449044585988
44              82.01     81.5055095541401           0.504490445859873
45              83.05     81.6155095541401           1.43449044585987
46              83.34     81.7055095541401           1.63449044585987
47              82.87     81.6175095541401           1.25249044585987
48              83.18     80.9415095541401           2.23849044585987
49              83.97     81.2951910828025           2.67480891719746
50              82.98     83.3323598726115           -0.35235987261146
51              82.2      82.2603598726115           -0.0603598726114599
52              83.68     81.5203598726115           2.15964012738854
53              83.49     81.6283598726115           1.86164012738853
54              82.68     81.4723598726115           1.20764012738854
55              81.56     81.4083598726115           0.151640127388538
56              81.19     80.3383598726115           0.851640127388531
57              81.07     80.4483598726115           0.62164012738853
58              79.81     80.5383598726115           -0.728359872611464
59              79.72     80.4503598726115           -0.730359872611468
60              78.32     79.7743598726115           -1.45435987261147
61              76.6      80.1280414012739           -3.52804140127388


Source: https://freestatistics.org/blog/index.php?pk=6092&T=4

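The Interpolation (Forecast) column holds the model's fitted value for each observation, and the Residuals (Prediction Error) column the difference between the actual and the fitted value, so Actuals = Interpolation + Residuals on every row. In terms of the fitted object mylm created by the R code below:

# Relationship between the columns of the table above and the fitted model.
interpolation    <- fitted(mylm)   # "Interpolation (Forecast)" column
prediction_error <- resid(mylm)    # "Residuals (Prediction Error)" column
all.equal(unname(interpolation + prediction_error), unname(x[, 1]))   # equals the Actuals column
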



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1]) # move the dependent variable (column par1) to the front
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){ # optional transformation: replace all series by first differences
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){ # seasonal dummies M1..M11; the 12th month of each cycle is the reference
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){ # append a deterministic linear trend t = 1, 2, ..., n
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df)) # OLS fit: the first column of df is regressed on all remaining columns
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable') # server-side helpers (table.start, table.element, ...) used below
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
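
Note that the script assumes y, par1, par2 and par3 are already defined (see the Parameters section above) and that the file 'createtable' provides the table.start/table.row.start/table.element/table.row.end/table.end/table.save/hyperlink helpers, which are specific to the FreeStatistics.org server and are not shown here. Purely as an illustration, minimal stand-in definitions such as the following (used in place of the load(file='createtable') line) would let the tabulation part of the script run locally; they are not the original implementations:

# Hypothetical stand-ins for the server-side table helpers; for local experiments only.
table.start     <- function() "<table>"
table.row.start <- function(a) c(a, "<tr>")
table.element   <- function(a, content, colspan = 1, header = FALSE) {
tag <- if (header) "th" else "td"
c(a, sprintf("<%s colspan='%s'>%s</%s>", tag, colspan, content, tag))
}
table.row.end   <- function(a) c(a, "</tr>")
table.end       <- function(a) c(a, "</table>")
table.save      <- function(a, file) writeLines(a, con = file)
hyperlink       <- function(url, text, title) sprintf("<a href='%s'>%s</a>", url, text)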