Free Statistics


Author: *Unverified author*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 20 Nov 2007 11:22:48 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/20/t1195582578mtt24c2r8m4alht.htm/, Retrieved Sun, 05 May 2024 09:21:11 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5805, Retrieved Sun, 05 May 2024 09:21:11 +0000

Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: paper, q3, W6, multiple regression
Estimated Impact: 280
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Multiple Regression] [paper (Q3 W6)] [2007-11-20 18:22:48] [bd7b8d7754bcf95ad80b21f541dc6b78] [Current]

Dataseries X:
95.90	96.92
96.06	96.06
96.31	96.59
96.34	96.67
96.49	97.27
96.22	96.38
96.53	96.47
96.50	96.05
96.77	96.76
96.66	96.51
96.58	96.55
96.63	95.97
97.06	97.00
97.73	97.46
98.01	97.90
97.76	98.42
97.49	98.54
97.77	99.00
97.96	98.94
98.23	99.02
98.51	100.07
98.19	98.72
98.37	98.73
98.31	98.04
98.60	99.08
98.97	99.22
99.11	99.57
99.64	100.44
100.03	100.84
99.98	100.75
100.32	100.49
100.44	99.98
100.51	99.96
101.00	99.76
100.88	100.11
100.55	99.79
100.83	100.29
101.51	101.12
102.16	102.65
102.39	102.71
102.54	103.39
102.85	102.80
103.47	102.07
103.57	102.15
103.69	101.21
103.50	101.27
103.47	101.86
103.45	101.65
103.48	101.94
103.93	102.62
103.89	102.71
104.40	103.39
104.79	104.51
104.77	104.09
105.13	104.29
105.26	104.57
104.96	105.39
104.75	105.15
105.01	106.13
105.15	105.46
105.20	106.47
105.77	106.62
105.78	106.52
106.26	108.04
106.13	107.15
106.12	107.32
106.57	107.76
106.44	107.26
106.54	107.89




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: 'Herman Ole Andreas Wold' @ 193.190.124.10:1001

Source: https://freestatistics.org/blog/index.php?pk=5805&T=0







Multiple Linear Regression - Estimated Regression Equation
Y[t] = 89.9393085089868 + 0.0538729697220403 X[t] + 0.163315226142462 t + e[t]

Source: https://freestatistics.org/blog/index.php?pk=5805&T=1
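The reported coefficients can be cross-checked with a direct lm() call. The following is a minimal sketch, not part of the original script: it assumes the first and second columns of the data series above are available as numeric vectors Y and X, and it names the trend regressor trend (the tables label it t).

# Minimal sketch (assumed vectors Y and X): refit Y[t] = b0 + b1*X[t] + b2*t + e[t]
trend <- seq_along(Y)        # linear trend 1, 2, ..., n
fit <- lm(Y ~ X + trend)     # ordinary least squares
coef(fit)                    # should reproduce 89.9393..., 0.0538729..., 0.1633152...
summary(fit)                 # standard errors, t-statistics and p-values as in the tables below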







Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	89.9393085089868	5.91696	15.2003	0	0
X	0.0538729697220403	0.062168	0.8666	0.389318	0.194659
t	0.163315226142462	0.01096	14.9004	0	0

Source: https://freestatistics.org/blog/index.php?pk=5805&T=2
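Each T-STAT above is the parameter estimate divided by its standard deviation, and the p-values follow from a Student t distribution with n - k = 69 - 3 = 66 residual degrees of freedom; the 1-tail value is half the 2-tail value. A small illustrative sketch using the numbers reported for X:

# Sketch: reproduce the test statistics reported for the X coefficient
est <- 0.0538729697220403
se  <- 0.062168
tstat <- est / se                     # about 0.8666
p2 <- 2 * pt(-abs(tstat), df = 66)    # 2-tail p-value, about 0.3893
p1 <- p2 / 2                          # 1-tail p-value, about 0.1947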







Multiple Linear Regression - Regression Statistics
Multiple R	0.991872264789074
R-squared	0.983810589657807
Adjusted R-squared	0.98332000146562
F-TEST (value)	2005.36948366148
F-TEST (DF numerator)	2
F-TEST (DF denominator)	66
p-value	0

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	0.450705785203678
Sum Squared Residuals	13.4069565178602

Source: https://freestatistics.org/blog/index.php?pk=5805&T=3
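These statistics are tied together by the usual identities: Multiple R is the square root of R-squared, the adjusted R-squared corrects for the k = 3 estimated parameters, the overall p-value comes from the F(2, 66) distribution, and the residual standard deviation is the square root of the residual sum of squares over the residual degrees of freedom. A minimal sketch of these checks, using the reported values:

# Sketch: consistency checks on the regression statistics reported above
R2 <- 0.983810589657807
n <- 69; k <- 3                          # observations; estimated parameters
sqrt(R2)                                 # Multiple R, about 0.99187
1 - (1 - R2) * (n - 1) / (n - k)         # Adjusted R-squared, about 0.98332
1 - pf(2005.36948366148, 2, 66)          # overall p-value, numerically 0
sqrt(13.4069565178602 / (n - k))         # Residual Standard Deviation, about 0.45071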







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	95.9	95.3239919605894	0.576008039410576
2	96.06	95.440976432771	0.619023567229067
3	96.31	95.632844332866	0.677155667133924
4	96.34	95.8004693965863	0.5395306034137
5	96.49	95.996108404562	0.493891595438005
6	96.22	96.1114766876518	0.108523312348163
7	96.53	96.2796404810693	0.25035951893072
8	96.5	96.4203290599285	0.079670940071514
9	96.77	96.6218940945736	0.148105905426399
10	96.66	96.7717410782855	-0.111741078285552
11	96.58	96.9372112232169	-0.357211223216894
12	96.63	97.0692801269206	-0.439280126920575
13	97.06	97.2880845118767	-0.228084511876732
14	97.73	97.4761813040913	0.253818695908669
15	98.01	97.6632006369115	0.34679936308851
16	97.76	97.8545298073094	-0.0945298073094129
17	97.49	98.0243097898185	-0.53430978981853
18	97.77	98.2124065820331	-0.442406582033129
19	97.96	98.3724894299923	-0.412489429992271
20	98.23	98.5401144937125	-0.310114493712486
21	98.51	98.759996338063	-0.249996338063089
22	98.19	98.8505830550808	-0.660583055080804
23	98.37	99.0144370109205	-0.64443701092048
24	98.31	99.1405798879547	-0.830579887954736
25	98.6	99.3599230026081	-0.759923002608128
26	98.97	99.5307804445117	-0.560780444511671
27	99.11	99.7129512100568	-0.602951210056846
28	99.64	99.9231359198575	-0.283135919857482
29	100.03	100.108000333889	-0.07800033388876
30	99.98	100.266466992756	-0.286466992756235
31	100.32	100.415775246771	-0.0957752467709772
32	100.44	100.551615258355	-0.111615258355194
33	100.51	100.713853025103	-0.203853025103208
34	101	100.866393657301	0.133606342698733
35	100.88	101.048564422846	-0.168564422846448
36	100.55	101.194640298678	-0.644640298677855
37	100.83	101.384892009681	-0.554892009681336
38	101.51	101.592921800693	-0.0829218006930849
39	102.16	101.838662670510	0.321337329489723
40	102.39	102.005210274836	0.384789725163943
41	102.54	102.205159120390	0.334840879610499
42	102.85	102.336689294396	0.513310705604029
43	103.47	102.460677252641	1.00932274735866
44	103.57	102.628302316362	0.94169768363843
45	103.69	102.740976950965	0.949023049034691
46	103.5	102.907524555291	0.592475444708908
47	103.47	103.102624833570	0.367375166430441
48	103.45	103.254626736070	0.195373263929612
49	103.48	103.433565123432	0.0464348765677601
50	103.93	103.633513968986	0.296486031014313
51	103.89	103.801677762403	0.0883222375968617
52	104.4	104.001626607957	0.398373392043417
53	104.79	104.225279560188	0.56472043981227
54	104.77	104.365968139047	0.404031860953055
55	105.13	104.540057959134	0.589942040866185
56	105.26	104.718457616798	0.541542383201561
57	104.96	104.925948678113	0.0340513218870146
58	104.75	105.076334391522	-0.326334391522152
59	105.01	105.292445127992	-0.282445127992208
60	105.15	105.419665464421	-0.269665464420902
61	105.2	105.637392389983	-0.437392389982628
62	105.77	105.808788561583	-0.0387885615834027
63	105.78	105.966716490754	-0.186716490753655
64	106.26	106.211918630874	0.0480813691263848
65	106.13	106.327286913963	-0.197286913963471
66	106.12	106.499760544959	-0.37976054495867
67	106.57	106.686779877779	-0.116779877778842
68	106.44	106.823158619060	-0.383158619060279
69	106.54	107.020413816128	-0.480413816127617

Source: https://freestatistics.org/blog/index.php?pk=5805&T=4
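Each value in the residuals column is simply the actual value minus the interpolation (the fitted value). An illustrative check for the first observation:

# Residual = actual - interpolation, checked for row 1 of the table above
95.90 - 95.3239919605894    # about 0.57600804, the reported residual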



Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
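# Reorder the columns so that the variable selected by par1 becomes the first (dependent) column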
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
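# Optionally replace all series by their first differences (par3; not used in this run)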
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
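# Optionally append 11 monthly dummy variables (par2; not used in this run)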
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
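# Optionally append 3 quarterly dummy variables (par2; not used in this run)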
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
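# Append a linear trend regressor t = 1, 2, ..., n when par3 = 'Linear Trend' (used in this run)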
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
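# Fit the multiple regression by ordinary least squares; the first column of df is the response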
(mylm <- lm(df))
(mysum <- summary(mylm))
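# Diagnostic plots: actuals and interpolation, residuals, histogram, density, normal Q-Q, lag plot, ACF and PACF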
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
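# Build the output tables with the helper functions (table.start, table.element, ...) loaded from 'createtable'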
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
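# The 1-tail p-value is reported as half of the 2-tail p-value from summary.lm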
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
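# The interpolation (fitted value) is recovered as the actual value minus the residual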
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')