Free Statistics

Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 24 Nov 2008 06:45:00 -0700

Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Nov/24/t1227534425og0npznp0ayyx6w.htm/, Retrieved Sat, 18 May 2024 17:35:41 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=25427, Retrieved Sat, 18 May 2024 17:35:41 +0000

Is Private? No (this computation is public)
Estimated Impact: 186

Family (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
F        [Multiple Regression] [export België ger...] [2007-11-18 15:14:24] [77b67056de5b91120f9ef0c52896823b]
F    D   [Multiple Regression] [Seatbelt law case] [2008-11-24 13:45:00] [4f61d9107c13aaeb903747e6b417c65a] [Current]
Feedback Forum

2008-12-01 16:19:47 [Stefan Temmerman] [reply]
The student once again gives a very extensive explanation, which is for the most part correct. Significance, seasonality, trend, etc. are examined, and the model is also tested. One remark is that a dummy variable may only take the values 0 and 1: the condition either applies or it does not. The value 2 that is used here for strong growth is nonsense. It is also stated that only a 1-tailed p-value is considered here, because a good economic period would not influence exports. I am not sure about this, because I can imagine that the opposite sometimes applies as well, namely that exports decline during a good economic period.
The T-STAT is not used here either, but from those values we can conclude that it is acceptable. The adjusted R² is also significant here. A conclusion on whether or not the model is good is missing as well. The means are not constant at zero here, and the normal distribution is quite skewed. This indicates that the model could be improved.

2008-12-02 05:50:08 [Sofie Mertens] [reply]
The explanation is very extensive and clear, with the student's own insights. The graphs are interpreted correctly. It is also positive that the data are explicitly mentioned. In my opinion, a clear conclusion at the end would still have been welcome.
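
To illustrate Stefan Temmerman's remark about dummy coding: a dummy variable should only take the values 0 and 1, so a value of 2 for a strong economic period is better expressed as a second 0/1 dummy. A minimal R sketch of such a recoding (the variable name 'code' is illustrative and stands for the 0/1/2 column of Dataseries X below):

# Illustrative recoding of a 0/1/2 indicator into two proper 0/1 dummies
code <- c(1, 1, 0, 2, 2)                  # hypothetical excerpt of the indicator column
dummy_applies <- as.numeric(code >= 1)    # 1 if the condition applies at all, 0 otherwise
dummy_strong  <- as.numeric(code == 2)    # separate 0/1 dummy for a strong economic period
cbind(code, dummy_applies, dummy_strong)
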
Dataseries X:
12103	1
12989	1
11610	1
10206	1
11356	1
11307	1
12649	1
11947	1
11714	0
12193	1
11269	1
9097	1
12640	1
13040	1
11687	1
11192	1
11392	1
11793	1
13933	1
12778	1
11810	2
13698	2
11957	2
10724	2
13939	1
13980	2
13807	2
12974	1
12510	2
12934	2
14908	2
13772	2
13013	2
14050	2
11817	2
11593	2
14466	2
13616	2
14734	2
13881	2
13528	2
13584	2
16170	2
13261	2
14742	2
15487	2
13155	2
12621	2
15032	1
15452	1
15428	2
13106	2
14717	1
14180	1
16202	1
15036	1
15915	1
16468	1
14730	1
13705	1




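The two tab-separated columns above can be read into R for the checks sketched further below. This is a sketch only; the file name and the column names y and x are illustrative and not part of the archived module:

# Sketch: load Dataseries X from a text file into two vectors
dat <- read.table('dataseries_x.txt', header = FALSE, col.names = c('y', 'x'))
y <- dat$y   # observed series (column 1)
x <- dat$x   # 0/1/2 indicator (column 2)
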
Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Sir Ronald Aylmer Fisher' @ 193.190.124.24

Source: https://freestatistics.org/blog/index.php?pk=25427&T=0

Multiple Linear Regression - Estimated Regression Equation
y[t] = 9033.29419953596 - 101.495359628771 x[t] + 2865.40413766434 M1[t] + 2990.73936581593 M2[t] + 2574.47459396752 M3[t] + 1298.41167826759 M4[t] + 1652.84783449343 M5[t] + 1637.48399071926 M6[t] + 3575.92014694509 M7[t] + 2087.95630317092 M8[t] + 2093.59245939675 M9[t] + 2979.92768754834 M10[t] + 1111.96384377417 M11[t] + 74.3638437741686 t + e[t]

Source: https://freestatistics.org/blog/index.php?pk=25427&T=1


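As a check, the equation reproduces the first interpolated value in the Actuals table further below: for the first observation t = 1, x[1] = 1 and M1[1] = 1, while all other dummies are zero, so

# Fitted value for observation 1, using the coefficients of the equation above
9033.29419953596 - 101.495359628771*1 + 2865.40413766434*1 + 74.3638437741686*1
# = 11871.5668213457; the residual is 12103 - 11871.5668213457 = 231.433178654281
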
Multiple Linear Regression - Ordinary Least Squares

Variable      Parameter           S.D.         T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   9033.29419953596    311.834726   28.9682                      0                0
x             -101.495359628771   132.805895   -0.7642                      0.448628         0.224314
M1            2865.40413766434    323.489321   8.8578                       0                0
M2            2990.73936581593    320.950123   9.3184                       0                0
M3            2574.47459396752    320.435791   8.0343                       0                0
M4            1298.41167826759    320.247166   4.0544                       0.000192         9.6e-05
M5            1652.84783449343    319.971676   5.1656                       5e-06            3e-06
M6            1637.48399071926    319.747027   5.1212                       6e-06            3e-06
M7            3575.92014694509    319.573326   11.1897                      0                0
M8            2087.95630317092    319.450657   6.5361                       0                0
M9            2093.59245939675    319.379079   6.5552                       0                0
M10           2979.92768754834    318.467604   9.3571                       0                0
M11           1111.96384377417    318.390675   3.4925                       0.001069         0.000534
t             74.3638437741686    4.041155     18.4016                      0                0

Source: https://freestatistics.org/blog/index.php?pk=25427&T=2







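Two relationships in the table above are worth noting: each T-STAT equals the parameter estimate divided by its standard deviation, and the 1-tail p-value is simply half of the 2-tail p-value. For the coefficient of x, for example:

# Checks against the x row of the OLS table above
-101.495359628771 / 132.805895   # -0.7642, the reported T-STAT
0.448628 / 2                     # 0.224314, the reported 1-tail p-value
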
Multiple Linear Regression - Regression Statistics
Multiple R: 0.960293179246709
R-squared: 0.922162990107751
Adjusted R-squared: 0.900165574268637
F-TEST (value): 41.9214237186917
F-TEST (DF numerator): 13
F-TEST (DF denominator): 46
p-value: 0

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 503.379307808244
Sum Squared Residuals: 11655973.4663573

Source: https://freestatistics.org/blog/index.php?pk=25427&T=3







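The statistics above are mutually consistent: with n = 60 observations, 13 regressors plus an intercept, and 46 residual degrees of freedom,

# Consistency checks for the regression and residual statistics above
1 - (1 - 0.922162990107751) * (60 - 1) / 46                    # adjusted R-squared, ~0.9002
(0.922162990107751 / 13) / ((1 - 0.922162990107751) / 46)      # F-TEST value, ~41.92
sqrt(11655973.4663573 / 46)                                    # residual standard deviation, ~503.38
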
Multiple Linear Regression - Actuals, Interpolation, and Residuals

Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1    12103   11871.5668213457    231.433178654281
2    12989   12071.2658932715    917.734106728538
3    11610   11729.3649651972    -119.364965197215
4    10206   10527.6658932715    -321.665893271461
5    11356   10956.4658932715    399.534106728539
6    11307   11015.4658932715    291.534106728539
7    12649   13028.2658932715    -379.265893271461
8    11947   11614.6658932715    332.334106728538
9    11714   11796.1612529002    -82.1612529002318
10   12193   12655.3649651972    -462.364965197215
11   11269   10861.7649651972    407.235034802785
12   9097    9824.16496519721    -727.164965197215
13   12640   12763.9329466357    -123.932946635728
14   13040   12963.6320185615    76.3679814385146
15   11687   12621.7310904872    -934.731090487239
16   11192   11420.0320185615    -228.032018561485
17   11392   11848.8320185615    -456.832018561485
18   11793   11907.8320185615    -114.832018561485
19   13933   13920.6320185615    12.3679814385150
20   12778   12507.0320185615    270.967981438515
21   11810   12485.5366589327    -675.536658932714
22   13698   13446.2357308585    251.764269141531
23   11957   11652.6357308585    304.364269141531
24   10724   10615.0357308585    108.964269141531
25   13939   13656.2990719258    282.700928074249
26   13980   13754.5027842227    225.497215777263
27   13807   13412.6018561485    394.398143851508
28   12974   12312.3981438515    661.601856148492
29   12510   12639.7027842227    -129.702784222738
30   12934   12698.7027842227    235.297215777263
31   14908   14711.5027842227    196.497215777262
32   13772   13297.9027842227    474.097215777262
33   13013   13377.9027842227    -364.902784222738
34   14050   14338.6018561485    -288.601856148492
35   11817   12545.0018561485    -728.001856148492
36   11593   11507.4018561485    85.5981438515079
37   14466   14447.169837587     18.8301624129955
38   13616   14646.8689095128    -1030.86890951276
39   14734   14304.9679814385    429.032018561485
40   13881   13103.2689095128    777.731090487239
41   13528   13532.0689095128    -4.06890951276136
42   13584   13591.0689095128    -7.0689095127609
43   16170   15603.8689095128    566.131090487239
44   13261   14190.2689095128    -929.26890951276
45   14742   14270.2689095128    471.731090487239
46   15487   15230.9679814385    256.032018561484
47   13155   13437.3679814385    -282.367981438516
48   12621   12399.7679814385    221.232018561485
49   15032   15441.0313225058    -409.031322505798
50   15452   15640.7303944316    -188.730394431555
51   15428   15197.3341067285    230.665893271461
52   13106   13995.6350348028    -889.635034802785
53   14717   14525.9303944316    191.069605568445
54   14180   14584.9303944316    -404.930394431556
55   16202   16597.7303944316    -395.730394431555
56   15036   15184.1303944316    -148.130394431555
57   15915   15264.1303944316    650.869605568446
58   16468   16224.8294663573    243.170533642691
59   14730   14431.2294663573    298.770533642691
60   13705   13393.6294663573    311.370533642690

Source: https://freestatistics.org/blog/index.php?pk=25427&T=4



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
# put the column selected by par1 (the endogenous series) first
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
# par3: optionally replace the series by their first differences
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
# par2: append seasonal dummies (M1..M11 for monthly data, Q1..Q3 for quarterly; the omitted period is the baseline)
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
# par3: append a linear trend column t = 1..n
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
# fit OLS: the first column of df is the response, all other columns are regressors
(mylm <- lm(df))
(mysum <- summary(mylm))
# diagnostic plots: actuals and interpolation, residuals, histogram, density, normal Q-Q, lag plot, ACF/PACF, lm diagnostics
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
# build the HTML output tables with the server-side 'createtable' helpers
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
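
The headline estimates can also be reproduced outside the wasp framework with a plain lm() call. This is only a sketch, assuming y and x hold the two columns of Dataseries X (as in the reading sketch after the data above); the coefficient labels may differ from M1..M11, but the estimates, R-squared and F-test should match the tables above.

# Sketch: refit the same model with base R only (no server-side table helpers)
n     <- length(y)                                  # 60 observations
month <- factor(((seq_len(n) - 1) %% 12) + 1)       # month index 1..12, repeating
month <- relevel(month, ref = '12')                 # month 12 as baseline, like the M1..M11 dummies
trend <- seq_len(n)                                 # linear trend, as selected by par3
fit   <- lm(y ~ x + month + trend)
summary(fit)        # coefficients, R-squared, adjusted R-squared, F-test
sum(resid(fit)^2)   # sum of squared residuals (11655973.47 in the output above)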