Free Statistics


Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sun, 23 Nov 2008 13:07:48 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Nov/23/t1227470924pytkmqgx1m8l54p.htm/, Retrieved Sun, 19 May 2024 08:55:57 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=25332, Retrieved Sun, 19 May 2024 08:55:57 +0000
Is private? No (this computation is public)
Estimated impact: 155
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
F     [Multiple Regression] [] [2007-11-19 19:55:31] [b731da8b544846036771bbf9bf2f34ce]
-    D  [Multiple Regression] [9/11 op prijs diesel ("9/11 on the diesel price")] [2008-11-23 20:03:30] [8b0d202c3a0c4ea223fd8b8e731dacd8]
-   PD      [Multiple Regression] [9/11 en prijs die...] [2008-11-23 20:07:48] [9ba97de59bb4d2edf0cfeac4ca7d2b73] [Current]
Dataseries X (first column: dependent variable y; second column: 0/1 regressor x):
0.84	0
0.76	0
0.77	0
0.76	0
0.77	0
0.78	0
0.79	0
0.78	0
0.76	0
0.78	1
0.76	1
0.74	1
0.73	1
0.72	1
0.71	1
0.73	1
0.75	1
0.75	1
0.72	1
0.72	1
0.72	1
0.74	1
0.78	1
0.74	1
0.74	1
0.75	1
0.78	1
0.81	1
0.75	1
0.7	1
0.71	1
0.71	1
0.73	1
0.74	1
0.74	1
0.75	1
0.74	1
0.74	1
0.73	1
0.76	1
0.8	1
0.83	1
0.81	1
0.83	1
0.88	1
0.89	1
0.93	1
0.91	1
0.9	1
0.86	1
0.88	1
0.93	1
0.98	1
0.97	1
1.03	1
1.06	1
1.06	1
1.09	1
1.04	1
1	1
1.04	1
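For readers who want to rerun this analysis locally, a minimal loading sketch is given below; the file name dataseries.txt and the column names price and x911 are assumptions for illustration, not part of the original computation:

# Hypothetical loader: save the two columns above as whitespace-separated text
dat <- read.table("dataseries.txt", col.names = c("price", "x911"))
# The module's R code at the end of this page expects an object `y` whose
# transpose t(y) has observations in rows and series in columns
y <- t(as.matrix(dat))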




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'George Udny Yule' @ 72.249.76.132
Source: https://freestatistics.org/blog/index.php?pk=25332&T=0


Multiple Linear Regression - Estimated Regression Equation
y[t] = 0.76161379310345 - 0.160982758620690 x[t] + 0.0084152298850578 M1[t] - 0.0310385057471264 M2[t] - 0.0293543103448277 M3[t] - 0.0116701149425288 M4[t] - 0.00598591954022995 M5[t] - 0.0163017241379312 M6[t] - 0.0166175287356323 M7[t] - 0.0149333333333335 M8[t] - 0.0112491379310346 M9[t] + 0.0326316091954023 M10[t] + 0.0283158045977011 M11[t] + 0.00631580459770114 t + e[t]
Source: https://freestatistics.org/blog/index.php?pk=25332&T=1
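As a cross-check, a minimal sketch of fitting the same design (the 9/11 dummy x, eleven monthly dummies M1..M11, and a linear trend t) with lm(); the file and column names are the same assumptions as in the loading sketch above:

dat <- read.table("dataseries.txt", col.names = c("price", "x911"))  # hypothetical file, see above
n <- nrow(dat)
# Monthly dummies: M_m is 1 when the observation falls in month m (m = 1..11), mirroring the module code
M <- sapply(1:11, function(m) as.numeric((((1:n) - 1) %% 12) + 1 == m))
colnames(M) <- paste0("M", 1:11)
df <- data.frame(y = dat$price, x = dat$x911, M, t = 1:n)
fit <- lm(y ~ ., data = df)
coef(fit)  # should reproduce the coefficients in the equation above, up to rounding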




Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter              S.D.      T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   0.76161379310345       0.036716   20.7436                     0                0
x             -0.160982758620690     0.029493   -5.4584                     2e-06            1e-06
M1            0.0084152298850578     0.038289   0.2198                      0.826992         0.413496
M2            -0.0310385057471264    0.040121   -0.7736                     0.443031         0.221515
M3            -0.0293543103448277    0.040092   -0.7322                     0.467704         0.233852
M4            -0.0116701149425288    0.040072   -0.2912                     0.77216          0.38608
M5            -0.00598591954022995   0.040061   -0.1494                     0.88186          0.44093
M6            -0.0163017241379312    0.040058   -0.407                      0.685887         0.342943
M7            -0.0166175287356323    0.040063   -0.4148                     0.680187         0.340093
M8            -0.0149333333333335    0.040077   -0.3726                     0.71111          0.355555
M9            -0.0112491379310346    0.0401     -0.2805                     0.780302         0.390151
M10           0.0326316091954023     0.039801   0.8199                      0.416425         0.208213
M11           0.0283158045977011     0.039788   0.7117                      0.480186         0.240093
t             0.00631580459770114    0.000585   10.789                      0                0
Source: https://freestatistics.org/blog/index.php?pk=25332&T=2
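The reported p-values follow from the t-statistics with 61 - 14 = 47 residual degrees of freedom; a quick check for the x coefficient (t value copied from the table):

2 * pt(-abs(-5.4584), df = 47)  # two-tailed p-value, reported as 2e-06 after rounding to 6 decimals
pt(-abs(-5.4584), df = 47)      # one-tailed p-value, reported as 1e-06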


Multiple Linear Regression - Regression Statistics
Multiple R: 0.855163567900384
R-squared: 0.731304727864114
Adjusted R-squared: 0.656984758975465
F-TEST (value): 9.83994932720975
F-TEST (DF numerator): 13
F-TEST (DF denominator): 47
p-value: 1.83569648370963e-09

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.0629031210770387
Sum Squared Residuals: 0.185969724137932
Source: https://freestatistics.org/blog/index.php?pk=25332&T=3
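A quick consistency check on these figures, using n = 61 observations and k = 14 estimated coefficients (intercept, x, M1..M11, t):

R2 <- 0.731304727864114
n <- 61; k <- 14
1 - (1 - R2) * (n - 1) / (n - k)              # adjusted R-squared, about 0.656985
1 - pf(9.83994932720975, df1 = 13, df2 = 47)  # F-test p-value, about 1.84e-09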


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1    0.84   0.776344827586205   0.0636551724137951
2    0.76   0.743206896551724   0.0167931034482763
3    0.77   0.751206896551724   0.0187931034482756
4    0.76   0.775206896551724   -0.0152068965517243
5    0.77   0.787206896551724   -0.0172068965517245
6    0.78   0.783206896551724   -0.00320689655172441
7    0.79   0.789206896551724   0.000793103448275535
8    0.78   0.797206896551724   -0.0172068965517244
9    0.76   0.807206896551725   -0.0472068965517245
10   0.78   0.696420689655172   0.0835793103448276
11   0.76   0.698420689655172   0.0615793103448276
12   0.74   0.676420689655172   0.0635793103448275
13   0.73   0.691151724137931   0.0388482758620685
14   0.72   0.658013793103448   0.0619862068965519
15   0.71   0.666013793103448   0.0439862068965518
16   0.73   0.690013793103448   0.0399862068965517
17   0.75   0.702013793103448   0.0479862068965518
18   0.75   0.698013793103448   0.0519862068965517
19   0.72   0.704013793103448   0.0159862068965518
20   0.72   0.712013793103448   0.0079862068965517
21   0.72   0.722013793103448   -0.00201379310344828
22   0.74   0.772210344827586   -0.0322103448275863
23   0.78   0.774210344827586   0.00578965517241375
24   0.74   0.752210344827586   -0.0122103448275863
25   0.74   0.766941379310345   -0.0269413793103453
26   0.75   0.733803448275862   0.0161965517241377
27   0.78   0.741803448275862   0.038196551724138
28   0.81   0.765803448275862   0.044196551724138
29   0.75   0.777803448275862   -0.027803448275862
30   0.7    0.773803448275862   -0.073803448275862
31   0.71   0.779803448275862   -0.069803448275862
32   0.71   0.787803448275862   -0.077803448275862
33   0.73   0.797803448275862   -0.067803448275862
34   0.74   0.848               -0.108
35   0.74   0.85                -0.11
36   0.75   0.828               -0.078
37   0.74   0.842731034482759   -0.102731034482759
38   0.74   0.809593103448276   -0.069593103448276
39   0.73   0.817593103448276   -0.0875931034482758
40   0.76   0.841593103448276   -0.0815931034482758
41   0.8    0.853593103448276   -0.0535931034482758
42   0.83   0.849593103448276   -0.0195931034482758
43   0.81   0.855593103448276   -0.0455931034482757
44   0.83   0.863593103448276   -0.0335931034482758
45   0.88   0.873593103448276   0.00640689655172419
46   0.89   0.923789655172414   -0.0337896551724138
47   0.93   0.925789655172414   0.00421034482758626
48   0.91   0.903789655172414   0.00621034482758623
49   0.9    0.918520689655173   -0.0185206896551727
50   0.86   0.88538275862069    -0.0253827586206899
51   0.88   0.89338275862069    -0.0133827586206895
52   0.93   0.91738275862069    0.0126172413793105
53   0.98   0.92938275862069    0.0506172413793105
54   0.97   0.92538275862069    0.0446172413793105
55   1.03   0.93138275862069    0.0986172413793105
56   1.06   0.93938275862069    0.120617241379311
57   1.06   0.94938275862069    0.110617241379311
58   1.09   0.999579310344827   0.0904206896551725
59   1.04   1.00157931034483    0.0384206896551726
60   1      0.979579310344827   0.0204206896551725
61   1.04   0.994310344827587   0.0456896551724135
Source: https://freestatistics.org/blog/index.php?pk=25332&T=4
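The residuals in the last column square-sum to the Sum Squared Residuals reported above; dividing by the residual degrees of freedom (61 - 14 = 47) and taking the square root recovers the Residual Standard Deviation:

sqrt(0.185969724137932 / 47)  # about 0.0629031, the residual standard deviation reported above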




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
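# Transpose the uploaded dataseries and move the response column selected by par1 to the front; remaining columns become regressors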
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
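# Optional first differencing of all columns (not used in this run: par3 = 'Linear Trend')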
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
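# Seasonal dummies: append M1..M11 (monthly) or Q1..Q3 (quarterly) according to par2; here par2 = 'Include Monthly Dummies'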
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
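# Deterministic linear trend: append a column t = 1..n when par3 = 'Linear Trend'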
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
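# Diagnostic plots: actuals vs. interpolation, residuals, histogram, density, normal Q-Q, lag plot, ACF, PACF, and lm() diagnostics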
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
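# Build the output tables shown above: estimated equation, OLS coefficients, regression/residual statistics, and actuals/interpolation/residuals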
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')