Free Statistics

Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 19 Nov 2007 04:07:28 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/19/t1195470082ujk9vofg49fmjz9.htm/, Retrieved Fri, 03 May 2024 11:19:49 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5704, Retrieved Fri, 03 May 2024 11:19:49 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords: groep MENS
Estimated Impact: 211
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Seatbelt law_paper] [2007-11-19 11:07:28] [183840e644503a44411d430a3cdac4ba] [Current]
-   PD    [Multiple Regression] [Multiple Regressi...] [2008-12-10 20:09:17] [fd59abe368d8219a006d49608e51987e]
- R PD    [Multiple Regression] [Multiple Regressi...] [2008-12-10 20:18:27] [fd59abe368d8219a006d49608e51987e]
- R PD    [Multiple Regression] [Multiple Regression] [2008-12-10 20:22:53] [fd59abe368d8219a006d49608e51987e]
- R PD    [Multiple Regression] [Multiple Regression] [2008-12-10 20:28:26] [fd59abe368d8219a006d49608e51987e]

Dataseries X:
100,0	0
100,0	0
100,0	0
100,1	0
100,0	0
100,0	0
99,8	0
100,0	0
99,9	0
99,2	0
98,7	0
98,7	0
98,9	1
99,2	1
99,8	1
100,5	1
100,1	1
100,5	1
98,4	1
98,6	1
99,0	1
99,1	1
98,9	1
98,5	1
96,9	1
96,8	1
97,0	1
97,0	1
96,9	1
97,1	1
97,2	1
97,9	1
98,9	1
99,2	1
99,5	1
99,3	1
99,9	1
100,0	1
100,3	1
100,5	1
100,7	1
100,9	1
100,8	1
100,9	1
101,0	1
100,3	1
100,1	1
99,8	1
99,9	1
99,9	1
100,2	1
99,7	1
100,4	1
100,9	1
101,3	1
101,4	1
101,3	1
100,9	1
100,9	1
100,9	1
101,1	1
101,1	1
101,3	1
101,8	1
102,9	1
103,2	1
103,3	1
104,5	1
105,0	1
104,9	1
104,9	1
105,4	1
106,0	1
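
For readers who want to rerun the analysis locally, here is a minimal sketch of how a series in this layout could be read into R. The file name 'seatbelt.txt' and the column names y and x are assumptions; the decimal commas and tab separation follow the listing above:

# hypothetical file 'seatbelt.txt' holding the two tab-separated columns listed above
dat <- read.table('seatbelt.txt', sep = '\t', dec = ',', col.names = c('y', 'x'))
str(dat)   # 73 observations: the index series y and the 0/1 seatbelt-law dummy x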




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

Source: https://freestatistics.org/blog/index.php?pk=5704&T=0


Multiple Linear Regression - Estimated Regression Equation
y[t] = 98.7584114583333 - 2.8036328125 x[t] + 0.496667751736103 M1[t] + 0.0217339409722206 M2[t] + 0.192893880208329 M3[t] + 0.264053819444441 M4[t] + 0.401880425347222 M5[t] + 0.573040364583331 M6[t] + 0.177533637152775 M7[t] + 0.498693576388888 M8[t] + 0.703186848958332 M9[t] + 0.357680121527777 M10[t] + 0.162173394097222 M11[t] + 0.0955067274305556 t + e[t]

Source: https://freestatistics.org/blog/index.php?pk=5704&T=1
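
As a quick consistency check (a sketch using values copied from the tables on this page), the first fitted value follows directly from this equation: observation 1 falls in month M1, before the intervention (x = 0), at t = 1.

# fitted value for observation 1: intercept + M1 effect + trend term, with x = 0
98.7584114583333 + 0.496667751736103 - 2.8036328125*0 + 0.0955067274305556*1
# about 99.3505859375, the first Interpolation (Forecast) value reported further down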


Multiple Linear Regression - Ordinary Least Squares
Variable     Parameter            S.D.      T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  98.7584114583333     0.666595  148.1536                    0               0
x            -2.8036328125        0.560988  -4.9977                     5e-06           3e-06
M1           0.496667751736103    0.754312  0.6584                      0.512818        0.256409
M2           0.0217339409722206   0.786697  0.0276                      0.978053        0.489027
M3           0.192893880208329    0.785496  0.2456                      0.806868        0.403434
M4           0.264053819444441    0.784421  0.3366                      0.737596        0.368798
M5           0.401880425347222    0.78347   0.5129                      0.609902        0.304951
M6           0.573040364583331    0.782645  0.7322                      0.466955        0.233477
M7           0.177533637152775    0.781947  0.227                       0.821177        0.410589
M8           0.498693576388888    0.781375  0.6382                      0.525796        0.262898
M9           0.703186848958332    0.78093   0.9004                      0.371543        0.185771
M10          0.357680121527777    0.780612  0.4582                      0.648488        0.324244
M11          0.162173394097222    0.780421  0.2078                      0.836099        0.418049
t            0.0955067274305556   0.009967  9.5819                      0               0

Source: https://freestatistics.org/blog/index.php?pk=5704&T=2
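
A short sketch of how the columns of this table relate, using the x row as an example; the 59 degrees of freedom come from the F-test table below, and the last digits may differ slightly because the printed S.D. is rounded:

t.stat <- -2.8036328125 / 0.560988        # T-STAT = Parameter / S.D., about -4.9977
p.2tail <- 2 * pt(-abs(t.stat), df = 59)  # 2-tail p-value, about 5e-06
p.1tail <- p.2tail / 2                    # the 1-tail p-value is reported as half of the 2-tail value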


Multiple Linear Regression - Regression Statistics
Multiple R: 0.79267443661103
R-squared: 0.628332762456614
Adjusted R-squared: 0.546439981302987
F-TEST (value): 7.67262698378614
F-TEST (DF numerator): 13
F-TEST (DF denominator): 59
p-value: 1.34614241975584e-08

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.35161819898456
Sum Squared Residuals: 107.785433593750

Source: https://freestatistics.org/blog/index.php?pk=5704&T=3
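
These summary statistics are tied together by standard OLS identities; a sketch with n = 73 observations, 13 regressors plus an intercept, and 59 residual degrees of freedom:

R2 <- 0.628332762456614
sqrt(R2)                             # Multiple R, about 0.7927
1 - (1 - R2) * (73 - 1) / 59         # Adjusted R-squared, about 0.5464
Fval <- (R2 / 13) / ((1 - R2) / 59)  # F-TEST (value), about 7.673
1 - pf(Fval, 13, 59)                 # p-value, about 1.35e-08
sqrt(107.785433593750 / 59)          # Residual Standard Deviation, about 1.3516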


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	100	99.3505859375	0.649414062499957
2	100	98.9711588541667	1.02884114583334
3	100	99.2378255208333	0.762174479166673
4	100.1	99.4044921875	0.695507812499998
5	100	99.6378255208333	0.362174479166669
6	100	99.9044921875	0.0955078125000025
7	99.8	99.6044921875	0.195507812500000
8	100	100.021158854167	-0.0211588541666628
9	99.9	100.321158854167	-0.42115885416666
10	99.2	100.071158854167	-0.87115885416666
11	98.7	99.9711588541667	-1.27115885416666
12	98.7	99.9044921875	-1.20449218749999
13	98.9	97.6930338541667	1.20696614583334
14	99.2	97.3136067708333	1.88639322916667
15	99.8	97.5802734375	2.2197265625
16	100.5	97.7469401041667	2.75305989583333
17	100.1	97.9802734375	2.11972656249999
18	100.5	98.2469401041667	2.25305989583333
19	98.4	97.9469401041667	0.45305989583334
20	98.6	98.3636067708333	0.236393229166659
21	99	98.6636067708333	0.336393229166666
22	99.1	98.4136067708333	0.68639322916666
23	98.9	98.3136067708333	0.58639322916667
24	98.5	98.2469401041667	0.253059895833330
25	96.9	98.8391145833333	-1.93911458333332
26	96.8	98.4596875	-1.65968750000000
27	97	98.7263541666667	-1.72635416666667
28	97	98.8930208333333	-1.89302083333333
29	96.9	99.1263541666667	-2.22635416666666
30	97.1	99.3930208333333	-2.29302083333334
31	97.2	99.0930208333333	-1.89302083333333
32	97.9	99.5096875	-1.60968750000000
33	98.9	99.8096875	-0.909687499999996
34	99.2	99.5596875	-0.359687499999999
35	99.5	99.4596875	0.0403124999999973
36	99.3	99.3930208333333	-0.0930208333333387
37	99.9	99.9851953125	-0.0851953124999897
38	100	99.6057682291667	0.394231770833332
39	100.3	99.8724348958333	0.427565104166664
40	100.5	100.0391015625	0.4608984375
41	100.7	100.272434895833	0.427565104166667
42	100.9	100.5391015625	0.360898437500003
43	100.8	100.2391015625	0.560898437499997
44	100.9	100.655768229167	0.244231770833336
45	101	100.955768229167	0.0442317708333315
46	100.3	100.705768229167	-0.405768229166672
47	100.1	100.605768229167	-0.505768229166676
48	99.8	100.5391015625	-0.739101562500006
49	99.9	101.131276041667	-1.23127604166666
50	99.9	100.751848958333	-0.851848958333329
51	100.2	101.018515625	-0.818515624999998
52	99.7	101.185182291667	-1.48518229166666
53	100.4	101.418515625	-1.01851562500000
54	100.9	101.685182291667	-0.785182291666664
55	101.3	101.385182291667	-0.0851822916666698
56	101.4	101.801848958333	-0.401848958333332
57	101.3	102.101848958333	-0.801848958333338
58	100.9	101.851848958333	-0.95184895833333
59	100.9	101.751848958333	-0.851848958333331
60	100.9	101.685182291667	-0.785182291666665
61	101.1	102.277356770833	-1.17735677083334
62	101.1	101.8979296875	-0.797929687500008
63	101.3	102.164596354167	-0.86459635416667
64	101.8	102.331263020833	-0.531263020833337
65	102.9	102.564596354167	0.335403645833335
66	103.2	102.831263020833	0.368736979166666
67	103.3	102.531263020833	0.768736979166662
68	104.5	102.9479296875	1.55207031250000
69	105	103.2479296875	1.75207031250000
70	104.9	102.9979296875	1.90207031250000
71	104.9	102.8979296875	2.00207031250000
72	105.4	102.831263020833	2.56873697916667
73	106	103.4234375	2.57656250000000

Source: https://freestatistics.org/blog/index.php?pk=5704&T=4
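
The two computed columns are linked by residual = actual minus interpolation, and the residuals reproduce the Sum Squared Residuals reported above; a sketch for the first row:

100 - 99.3505859375   # row 1: about 0.6494, the reported Prediction Error
# and with mylm, the model fitted in the R code below, sum(residuals(mylm)^2)
# returns the Sum Squared Residuals of about 107.785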


Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
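The R code below assumes that the data matrix y and the three parameters already exist in the R session (they are supplied by the FreeStatistics server before the module runs). A minimal sketch of that setup, with dat as in the reading example near the top of the page and the orientation of y being an assumption inferred from 'x <- t(y)' in the code:
y <- t(as.matrix(dat))              # series in rows, so that t(y) has observations in rows
par1 <- '1'                         # column of the endogenous (dependent) series
par2 <- 'Include Monthly Dummies'   # seasonal dummy option
par3 <- 'Linear Trend'              # deterministic trend option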
R code (references can be found in the software module):
library(lattice)                     # provides densityplot() used further down
par1 <- as.numeric(par1)             # column number of the endogenous (dependent) series
x <- t(y)                            # data matrix: observations in rows, series in columns
k <- length(x[1,])                   # number of series (columns)
n <- length(x[,1])                   # number of observations (rows)
x1 <- cbind(x[,par1], x[,1:k!=par1]) # move the endogenous series to the first column
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames           # and reorder the column names accordingly
x <- x1
if (par3 == 'First Differences'){    # optional transformation: replace every series by its first differences
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){    # add 11 seasonal dummies M1..M11; the 12th month is the baseline
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1                     # every 12th observation, starting at observation i
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){  # or 3 quarterly dummies Q1..Q3; the 4th quarter is the baseline
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){               # append a deterministic linear trend t = 1, 2, ..., n
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])                   # recount the columns after adding dummies and trend
df <- as.data.frame(x)
(mylm <- lm(df))                     # OLS fit: the first column is regressed on all remaining columns
(mysum <- summary(mylm))             # coefficient table, R-squared, F-test and residual statistics
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)            # fitted (interpolated) values = actuals minus residuals
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))          # residuals as a time series
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)   # pair every residual with its lagged value
dum
dum1 <- dum[2:length(myerror),]          # keep the rows where both the residual and its lag are observed
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))                         # lowess smoother through the lag plot
abline(lm(z))                            # least-squares line through the lag plot
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')             # loads the FreeStatistics table-building helpers (table.start, table.element, ...)
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]               # build the equation string reported above, term by term
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){                      # one row per estimated coefficient
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])              # parameter estimate
a<-table.element(a, round(mysum$coefficients[i,2],6))    # standard error (reported as S.D.)
a<-table.element(a, round(mysum$coefficients[i,3],4))    # t-statistic
a<-table.element(a, round(mysum$coefficients[i,4],6))    # 2-tail p-value
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))  # 1-tail p-value, half of the 2-tail value
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])                   # actual value (first column of x)
a<-table.element(a,x[i]-mysum$resid[i])    # interpolation / forecast: actual minus residual
a<-table.element(a,mysum$resid[i])         # residual / prediction error
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')