Free Statistics

Author: verified (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 19 Nov 2008 12:03:08 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Nov/19/t12271215216xkpii0g5o213li.htm/, Retrieved Tue, 28 May 2024 05:20:35 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=25052, Retrieved Tue, 28 May 2024 05:20:35 +0000

Original text written by user: (none)
Is Private?: No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 208
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Q1 The Seatbeltlaw] [2007-11-14 19:27:43] [8cd6641b921d30ebe00b648d1481bba0]
F    D  [Multiple Regression] [Seatbelt Law - Q1] [2008-11-19 12:45:01] [82970caad4b026be9dd352fdec547fe4]
-    D    [Multiple Regression] [Seatbelt law - Q3...] [2008-11-19 18:53:10] [82970caad4b026be9dd352fdec547fe4]
F    D        [Multiple Regression] [Seatbelt law - Q3...] [2008-11-19 19:03:08] [c33ddd06d9ea3933c8ac89c0e74c9b3a] [Current]
Feedback Forum
2008-11-28 10:40:34 [Dana Molenberghs]
When we look at the 2-tailed p-values, we see that they are all very high. If we assume an alpha error of 5%, these values all lie above it. It is therefore clear that your findings are due to chance.

Dataseries X (first column: Goudkoers; second column: DrasticChange dummy):
10413	0
10709	0
10662	0
10570	0
10297	0
10635	0
10872	0
10296	0
10383	0
10431	0
10574	0
10653	0
10805	0
10872	0
10625	0
10407	0
10463	0
10556	0
10646	0
10702	0
11353	1
11346	1
11451	1
11964	1
12574	1
13031	1
13812	1
14544	1
14931	1
14886	1
16005	1
17064	1
15168	1
16050	1
15839	1
15137	1
14954	1
15648	1
15305	1
15579	1
16348	1
15928	1
16171	1
15937	1
15713	1
15594	1
15683	1
16438	1
17032	1
17696	1
17745	1
19394	1
20148	1
20108	1
18584	1
18441	1
18391	1
19178	1
18079	1
18483	1
19644	1




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Sir Ronald Aylmer Fisher' @ 193.190.124.24

Source: https://freestatistics.org/blog/index.php?pk=25052&T=0


Multiple Linear Regression - Estimated Regression Equation
Goudkoers[t] = 8213.3625 + 772.59375 DrasticChange[t] + 597.173958333335 M1[t] + 795.041666666667 M2[t] + 675.209375 M3[t] + 985.777083333334 M4[t] + 1165.94479166667 M5[t] + 992.7125 M6[t] + 867.280208333334 M7[t] + 741.247916666667 M8[t] + 141.896875 M9[t] + 301.664583333334 M10[t] - 51.3677083333331 M11[t] + 158.432291666667 t + e[t]
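As a quick arithmetic check (a Python sketch, not part of the original R module), the first interpolated value reported further below follows directly from this equation: at t = 1 the observation falls in January (M1 = 1) and before the change (DrasticChange = 0), so only the intercept, the M1 dummy, and the trend contribute.

```python
# Sanity check of the estimated equation at t = 1 (January, DrasticChange = 0).
intercept = 8213.3625
m1 = 597.173958333335          # coefficient of the January dummy M1
trend = 158.432291666667       # coefficient of the linear trend t
fitted_t1 = intercept + m1 + trend * 1
# fitted_t1 is approximately 8968.96875, the first interpolation value
```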

Source: https://freestatistics.org/blog/index.php?pk=25052&T=1


Multiple Linear Regression - Ordinary Least Squares

Variable        Parameter           S.D.         T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)     8213.3625           532.590892   15.4215                      0                0
DrasticChange   772.59375           488.426509   1.5818                       0.120402         0.060201
M1              597.173958333335    621.106446   0.9615                       0.341238         0.170619
M2              795.041666666667    651.830974   1.2197                       0.228662         0.114331
M3              675.209375          650.959948   1.0373                       0.304926         0.152463
M4              985.777083333334    650.346206   1.5158                       0.136274         0.068137
M5              1165.94479166667    649.990477   1.7938                       0.079282         0.039641
M6              992.7125            649.893183   1.5275                       0.133339         0.066669
M7              867.280208333334    650.054442   1.3342                       0.188578         0.094289
M8              741.247916666667    650.474061   1.1396                       0.260248         0.130124
M9              141.896875          648.560265   0.2188                       0.827763         0.413881
M10             301.664583333334    647.912163   0.4656                       0.643656         0.321828
M11             -51.3677083333331   647.522991   -0.0793                      0.937107         0.468554
t               158.432291666667    12.963376    12.2215                      0                0
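Two internal consistency checks on this table, shown as a Python sketch with the DrasticChange row as the example: the T-STAT is the parameter estimate divided by its standard deviation, and the 1-tail p-value is half the 2-tail p-value.

```python
param, sd = 772.59375, 488.426509      # DrasticChange row of the OLS table
t_stat = param / sd
print(round(t_stat, 4))                # 1.5818, as reported

two_tail = 0.120402
print(round(two_tail / 2, 6))          # 0.060201, the reported 1-tail p-value
```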

Source: https://freestatistics.org/blog/index.php?pk=25052&T=2








Multiple Linear Regression - Regression Statistics
Multiple R: 0.959503406781167
R-squared: 0.920646787624666
Adjusted R-squared: 0.898698026754893
F-TEST (value): 41.945273953599
F-TEST (DF numerator): 13
F-TEST (DF denominator): 47
p-value: 0

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1023.61855058467
Sum Squared Residuals: 49246362.04375
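The summary statistics above are mutually consistent. A Python sketch of the standard identities, assuming n = 61 observations and 13 regressors plus an intercept (hence 47 residual degrees of freedom):

```python
import math

r2, n, df_num, df_den = 0.920646787624666, 61, 13, 47
ssr = 49246362.04375

# Adjusted R-squared: 1 - (1 - R^2) * (n - 1) / df_den
adj_r2 = 1 - (1 - r2) * (n - 1) / df_den       # about 0.898698, as reported

# F statistic: (R^2 / df_num) / ((1 - R^2) / df_den)
f_value = (r2 / df_num) / ((1 - r2) / df_den)  # about 41.9453, as reported

# Residual standard deviation: sqrt(SSR / df_den)
sigma = math.sqrt(ssr / df_den)                # about 1023.6186, as reported
```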

Source: https://freestatistics.org/blog/index.php?pk=25052&T=3








Multiple Linear Regression - Actuals, Interpolation, and Residuals

Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1     10413    8968.96875     1444.03125000001
2     10709    9325.26875     1383.73125
3     10662    9363.86875     1298.13125
4     10570    9832.86875     737.13125
5     10297    10171.46875    125.531249999999
6     10635    10156.66875    478.33125
7     10872    10189.66875    682.33125
8     10296    10222.06875    73.9312499999993
9     10383    9781.15        601.849999999999
10    10431    10099.35       331.649999999999
11    10574    9904.75        669.25
12    10653    10114.55       538.45
13    10805    10870.15625    -65.1562500000023
14    10872    11226.45625    -354.456250000002
15    10625    11265.05625    -640.056250000001
16    10407    11734.05625    -1327.05625
17    10463    12072.65625    -1609.65625
18    10556    12057.85625    -1501.85625
19    10646    12090.85625    -1444.85625
20    10702    12123.25625    -1421.25625
21    11353    12454.93125    -1101.93125
22    11346    12773.13125    -1427.13125
23    11451    12578.53125    -1127.53125
24    11964    12788.33125    -824.33125
25    12574    13543.9375     -969.937500000002
26    13031    13900.2375     -869.2375
27    13812    13938.8375     -126.8375
28    14544    14407.8375     136.1625
29    14931    14746.4375     184.5625
30    14886    14731.6375     154.3625
31    16005    14764.6375     1240.3625
32    17064    14797.0375     2266.9625
33    15168    14356.11875    811.88125
34    16050    14674.31875    1375.68125
35    15839    14479.71875    1359.28125
36    15137    14689.51875    447.48125
37    14954    15445.125      -491.125000000002
38    15648    15801.425      -153.425
39    15305    15840.025      -535.025
40    15579    16309.025      -730.025
41    16348    16647.625      -299.625
42    15928    16632.825      -704.825
43    16171    16665.825      -494.825
44    15937    16698.225      -761.225
45    15713    16257.30625    -544.30625
46    15594    16575.50625    -981.50625
47    15683    16380.90625    -697.90625
48    16438    16590.70625    -152.70625
49    17032    17346.3125     -314.312500000002
50    17696    17702.6125     -6.61249999999961
51    17745    17741.2125     3.78750000000127
52    19394    18210.2125     1183.7875
53    20148    18548.8125     1599.1875
54    20108    18534.0125     1573.9875
55    18584    18567.0125     16.9874999999998
56    18441    18599.4125     -158.4125
57    18391    18158.49375    232.50625
58    19178    18476.69375    701.30625
59    18079    18282.09375    -203.09375
60    18483    18491.89375    -8.8937499999991
61    19644    19247.5        396.5
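Each row satisfies Actuals = Interpolation + Residual. A Python check for row 1 (the trailing ...00001 digits in some residuals are floating-point round-off from the R engine, not data):

```python
actual, interpolation = 10413, 8968.96875
residual = actual - interpolation
print(residual)    # 1444.03125 (reported as 1444.03125000001 due to round-off)
```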

Source: https://freestatistics.org/blog/index.php?pk=25052&T=4




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
# move the endogenous series (column par1) to the front of the matrix
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
# optionally replace the series by its first differences
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # note: 1:n-1 parses as (1:n)-1 in R, i.e. 0:(n-1)
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
# add 11 monthly dummy variables (December is the reference month)
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
# add 3 quarterly dummy variables (Q4 is the reference quarter)
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
# append a linear trend column named 't'
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df)) # regress the first column of df on all remaining columns
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i,1]) # actuals: first column of x (x[i] only worked via column-major indexing)
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
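For readers who prefer to verify the design-matrix logic outside R, the dummy-and-trend construction above can be sketched in plain Python. The function name is hypothetical; only the matrix construction is mirrored here (the fit itself is an ordinary least-squares solve, e.g. via numpy.linalg.lstsq).

```python
def design_matrix(n, change_at):
    """Build rows [intercept, M1..M11, DrasticChange, t] for t = 1..n,
    mirroring the x2[seq(i,n,12),i] <- 1 pattern of the R module."""
    rows = []
    for t in range(1, n + 1):
        month = (t - 1) % 12                       # 0 = January ... 11 = December
        dummies = [1.0 if month == m else 0.0 for m in range(11)]  # M1..M11
        change = 1.0 if t > change_at else 0.0     # 0 for the first 20 months
        rows.append([1.0] + dummies + [change, float(t)])
    return rows

X = design_matrix(61, change_at=20)
print(len(X), len(X[0]))   # 61 rows, 14 columns: intercept + 11 dummies + change + trend
```

With n = 61 the January dummy is 1 in six rows (t = 1, 13, ..., 61) and the change dummy in 41 rows, matching the 20 zeros and 41 ones of the input series.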