Free Statistics

Author's title: (not provided)
Author: verified (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sun, 23 Nov 2008 07:55:53 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Nov/23/t1227452239b1lbitu1xfrh3nn.htm/, Retrieved Sat, 18 May 2024 02:14:38 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=25277, Retrieved Sat, 18 May 2024 02:14:38 +0000

Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 168
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
F     [Multiple Regression] [] [2007-11-19 19:55:31] [b731da8b544846036771bbf9bf2f34ce]
-    D  [Multiple Regression] [seatbelt_3] [2008-11-23 14:35:32] [922d8ae7bd2fd460a62d9020ccd4931a]
F    D      [Multiple Regression] [seatbelt3CG] [2008-11-23 14:55:53] [89a49ebb3ece8e9a225c7f9f53a14c57] [Current]
Feedback Forum
2008-11-29 15:20:03 [Thomas Plasschaert] [reply]
Very good elaboration and interpretation.
2008-11-30 13:03:54 [6066575aa30c0611e452e930b1dff53d] [reply]
This question did not take seasonal dummies and a linear trend into account. The financial crisis was taken as the dummy variable. From the actuals and interpolation chart it was concluded that there is a positive trend and that a level shift also occurs.
2008-12-01 11:32:50 [Jessica Alves Pires] [reply]
I find this computation somewhat redundant, because no comparison is made between the computations with and without the linear trend and dummies. The student could, for example, have pointed out that the standard error drops from 9.48213286856506 to 5.00612210608242 once dummies and a linear trend are introduced, or that the R-squared rises from 0.066921912517871 to 0.777520986287843. Introducing dummies and a linear trend therefore yields a more accurate prediction. The student also discussed only one chart in the Word document.

Dataseries X:
97,8	0
107,4	0
117,5	0
105,6	0
97,4	0
99,5	0
98	0
104,3	0
100,6	0
101,1	0
103,9	0
96,9	0
95,5	0
108,4	0
117	0
103,8	0
100,8	0
110,6	0
104	0
112,6	0
107,3	0
98,9	0
109,8	0
104,9	0
102,2	0
123,9	0
124,9	0
112,7	0
121,9	0
100,6	0
104,3	0
120,4	0
107,5	0
102,9	0
125,6	0
107,5	0
108,8	0
128,4	0
121,1	0
119,5	0
128,7	0
108,7	0
105,5	0
119,8	0
111,3	0
110,6	0
120,1	0
97,5	0
107,7	0
127,3	0
117,2	0
119,8	0
116,2	0
111	0
112,4	0
130,6	0
109,1	0
118,8	0
123,9	0
101,6	0
112,8	0
128	0
129,6	0
125,8	0
119,5	0
115,7	0
113,6	0
129,7	0
112	0
116,8	0
127	0
112,1	1
114,2	1
121,1	1
131,6	1
125	1
120,4	1
117,7	1
117,5	1
120,6	1
127,5	1
112,3	1
124,5	1
115,2	1
105,4	1
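The series above is tab-separated and uses a decimal comma (European format). A minimal sketch of reading it into R, assuming the two columns are the Consumptiegoederen index and the financial-crisis dummy (names taken from the variable names used elsewhere on this page):

```r
# Sketch: load the raw series above.  Tab-separated, decimal comma,
# so read.table needs sep = "\t" and dec = ",".
# Column names are assumptions, not part of the raw input.
txt <- "97,8\t0
107,4\t0
117,5\t0"   # first three rows only, for brevity
df <- read.table(text = txt, sep = "\t", dec = ",",
                 col.names = c("Consumptiegoederen", "crisis"))
df$Consumptiegoederen   # 97.8 107.4 117.5
```

With the full 85-row series pasted into `txt`, the same call reproduces the data frame used by the regression below.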




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

Source: https://freestatistics.org/blog/index.php?pk=25277&T=0


Multiple Linear Regression - Estimated Regression Equation
Consumptiegoederen[t] = 112.170422535211 + 6.76529175050303 `Wel(1)_geen(0)_financiële_crisis`[t] + e[t]
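Because the only regressor is a 0/1 dummy, this equation simply encodes two group means: the intercept is the mean of the non-crisis observations, and intercept plus slope (112.170422535211 + 6.76529175050303 ≈ 118.935714285714) is the mean of the crisis observations. A minimal R sketch of that identity, using toy data rather than the series above:

```r
# Toy illustration (not the Consumptiegoederen data): with a single
# 0/1 regressor, the OLS fitted values are exactly the two group means.
y <- c(10, 12, 14, 20, 22)
d <- c(0, 0, 0, 1, 1)
fit <- lm(y ~ d)
unname(coef(fit)[1])   # intercept = mean of the d == 0 group (12)
sum(coef(fit))         # intercept + slope = mean of the d == 1 group (21)
```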

Source: https://freestatistics.org/blog/index.php?pk=25277&T=1


Multiple Linear Regression - Ordinary Least Squares

Variable                             Parameter         S.D.      T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)                          112.170422535211  1.125322  99.6785                     0               0
`Wel(1)_geen(0)_financiële_crisis`   6.76529175050303  2.772824  2.4399                      0.016823        0.008411
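The last four columns follow from the first two: the t-statistic is the parameter estimate divided by its standard deviation, and the p-values come from a Student t distribution with n − k = 85 − 2 = 83 degrees of freedom. A sketch recomputing the dummy's row from the reported estimate and standard deviation:

```r
# Recompute the test statistics for the dummy coefficient
# (df = n - k = 85 - 2 = 83, as in the F-test table below).
est  <- 6.76529175050303
se   <- 2.772824
tval <- est / se                # about 2.4399
p2   <- 2 * pt(-abs(tval), 83)  # 2-tail p-value, about 0.0168
p1   <- p2 / 2                  # 1-tail p-value, about 0.0084
```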

Source: https://freestatistics.org/blog/index.php?pk=25277&T=2


Multiple Linear Regression - Regression Statistics
Multiple R: 0.258692699003801
R-squared: 0.066921912517871
Adjusted R-squared: 0.0556800078494116
F-TEST (value): 5.95289806233894
F-TEST (DF numerator): 1
F-TEST (DF denominator): 83
p-value: 0.016822656077361

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 9.48213286856506
Sum Squared Residuals: 7462.60003018111
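These statistics are internally consistent: Multiple R is the square root of R-squared, and with a single regressor the F value equals both the squared t-statistic and R²/(1 − R²) scaled by the denominator degrees of freedom. A sketch of those checks:

```r
# Consistency checks among the reported statistics
# (one regressor, 83 denominator degrees of freedom).
r2   <- 0.066921912517871
fval <- 5.95289806233894
sqrt(r2)               # Multiple R, about 0.2586927
(r2 / (1 - r2)) * 83   # reproduces the F value
2.4399^2               # squared t-statistic, about 5.953
```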

Source: https://freestatistics.org/blog/index.php?pk=25277&T=3


Multiple Linear Regression - Actuals, Interpolation, and Residuals

Time or Index  Actuals  Interpolation (Forecast)  Residuals (Prediction Error)
1    97.8   112.170422535212  -14.3704225352120
2    107.4  112.170422535211  -4.77042253521125
3    117.5  112.170422535211  5.32957746478874
4    105.6  112.170422535211  -6.57042253521126
5    97.4   112.170422535211  -14.7704225352113
6    99.5   112.170422535211  -12.6704225352113
7    98     112.170422535211  -14.1704225352113
8    104.3  112.170422535211  -7.87042253521126
9    100.6  112.170422535211  -11.5704225352113
10   101.1  112.170422535211  -11.0704225352113
11   103.9  112.170422535211  -8.27042253521125
12   96.9   112.170422535211  -15.2704225352113
13   95.5   112.170422535211  -16.6704225352113
14   108.4  112.170422535211  -3.77042253521125
15   117    112.170422535211  4.82957746478874
16   103.8  112.170422535211  -8.37042253521126
17   100.8  112.170422535211  -11.3704225352113
18   110.6  112.170422535211  -1.57042253521126
19   104    112.170422535211  -8.17042253521126
20   112.6  112.170422535211  0.429577464788739
21   107.3  112.170422535211  -4.87042253521126
22   98.9   112.170422535211  -13.2704225352113
23   109.8  112.170422535211  -2.37042253521126
24   104.9  112.170422535211  -7.27042253521125
25   102.2  112.170422535211  -9.97042253521125
26   123.9  112.170422535211  11.7295774647887
27   124.9  112.170422535211  12.7295774647887
28   112.7  112.170422535211  0.529577464788748
29   121.9  112.170422535211  9.72957746478875
30   100.6  112.170422535211  -11.5704225352113
31   104.3  112.170422535211  -7.87042253521126
32   120.4  112.170422535211  8.22957746478875
33   107.5  112.170422535211  -4.67042253521126
34   102.9  112.170422535211  -9.27042253521125
35   125.6  112.170422535211  13.4295774647887
36   107.5  112.170422535211  -4.67042253521126
37   108.8  112.170422535211  -3.37042253521126
38   128.4  112.170422535211  16.2295774647888
39   121.1  112.170422535211  8.92957746478874
40   119.5  112.170422535211  7.32957746478874
41   128.7  112.170422535211  16.5295774647887
42   108.7  112.170422535211  -3.47042253521125
43   105.5  112.170422535211  -6.67042253521126
44   119.8  112.170422535211  7.62957746478874
45   111.3  112.170422535211  -0.870422535211258
46   110.6  112.170422535211  -1.57042253521126
47   120.1  112.170422535211  7.92957746478874
48   97.5   112.170422535211  -14.6704225352113
49   107.7  112.170422535211  -4.47042253521125
50   127.3  112.170422535211  15.1295774647887
51   117.2  112.170422535211  5.02957746478875
52   119.8  112.170422535211  7.62957746478874
53   116.2  112.170422535211  4.02957746478875
54   111    112.170422535211  -1.17042253521126
55   112.4  112.170422535211  0.229577464788750
56   130.6  112.170422535211  18.4295774647887
57   109.1  112.170422535211  -3.07042253521126
58   118.8  112.170422535211  6.62957746478874
59   123.9  112.170422535211  11.7295774647887
60   101.6  112.170422535211  -10.5704225352113
61   112.8  112.170422535211  0.629577464788742
62   128    112.170422535211  15.8295774647887
63   129.6  112.170422535211  17.4295774647887
64   125.8  112.170422535211  13.6295774647887
65   119.5  112.170422535211  7.32957746478874
66   115.7  112.170422535211  3.52957746478875
67   113.6  112.170422535211  1.42957746478874
68   129.7  112.170422535211  17.5295774647887
69   112    112.170422535211  -0.170422535211255
70   116.8  112.170422535211  4.62957746478874
71   127    112.170422535211  14.8295774647887
72   112.1  118.935714285714  -6.83571428571429
73   114.2  118.935714285714  -4.73571428571428
74   121.1  118.935714285714  2.16428571428571
75   131.6  118.935714285714  12.6642857142857
76   125    118.935714285714  6.06428571428571
77   120.4  118.935714285714  1.46428571428572
78   117.7  118.935714285714  -1.23571428571428
79   117.5  118.935714285714  -1.43571428571429
80   120.6  118.935714285714  1.66428571428571
81   127.5  118.935714285714  8.56428571428571
82   112.3  118.935714285714  -6.63571428571429
83   124.5  118.935714285714  5.56428571428571
84   115.2  118.935714285714  -3.73571428571428
85   105.4  118.935714285714  -13.5357142857143
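The "Interpolation" column takes only two values because each fitted value is a group mean: 112.170422535211 for the 71 pre-crisis observations and 118.935714285714 for the 14 crisis observations; each residual is the actual minus the corresponding mean. A sketch for the last row:

```r
# Residual for observation 85, recomputed from the table's columns:
# actual minus the crisis-period fitted value (the group mean).
actual  <- 105.4
fitted  <- 118.935714285714
resid85 <- actual - fitted   # about -13.5357, as reported
```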

Source: https://freestatistics.org/blog/index.php?pk=25277&T=4


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice) # provides densityplot(), used for the residual density plot below
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # note: 1:n-1 would evaluate to 0:(n-1)
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
# build a human-readable version of the estimated regression equation
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i,1])
a<-table.element(a,x[i,1]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')