Free Statistics Software

Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 14 Dec 2007 02:59:07 -0700

Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Dec/14/t11976255954sj2wpqg803vpfn.htm/, Retrieved Thu, 02 May 2024 20:41:09 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=3783, Retrieved Thu, 02 May 2024 20:41:09 +0000

IsPrivate? No (this computation is public)
User-defined keywords: s0650921, s0650125
Estimated Impact: 209

History (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
- [Multiple Regression] [paper_regressie_m...] [2007-12-14 09:59:07] [1232d415564adb2a600743f77b12553a] [Current]
Dataseries X (column 1: graanprijs; column 2: ontkoppelde_bedrijfstoeslag dummy):
102.7	0
103.2	0
105.6	0
103.9	0
107.2	0
100.7	0
92.1	0
90.3	0
93.4	0
98.5	0
100.8	0
102.3	0
104.7	0
101.1	0
101.4	0
99.5	0
98.4	0
96.3	0
100.7	0
101.2	0
100.3	0
97.8	0
97.4	0
98.6	0
99.7	0
99.0	0
98.1	0
97.0	0
98.5	0
103.8	0
114.4	0
124.5	0
134.2	0
131.8	0
125.6	0
119.9	0
114.9	0
115.5	0
112.5	0
111.4	0
115.3	0
110.8	0
103.7	0
111.1	0
113.0	0
111.2	0
117.6	0
121.7	0
127.3	0
129.8	0
137.1	0
141.4	0
137.4	0
130.7	0
117.2	0
110.8	0
111.4	0
108.2	0
108.8	0
110.2	0
109.5	1
109.5	1
116.0	1
111.2	1
112.1	1
114.0	1
119.1	1
114.1	1
115.1	1
115.4	1
110.8	1
116.0	1
119.2	1
126.5	1
127.8	1
131.3	1
140.3	1
137.3	1
143.0	1
134.5	1
139.9	1
159.3	1
170.4	1
175.0	1
175.8	1
180.9	1
180.3	1
169.6	1
172.3	1
184.8	1
177.7	1
184.6	1
211.4	1
215.3	1
215.9	1




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

Source: https://freestatistics.org/blog/index.php?pk=3783&T=0


Multiple Linear Regression - Estimated Regression Equation
graanprijs[t] = 109.56 + 37.18 ontkoppelde_bedrijfstoeslag[t] + e[t]
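Because the only regressor is a 0/1 dummy, the OLS fit reduces to group means: the intercept is the mean of graanprijs while the dummy is 0, and the slope is the difference between the two group means. A minimal numpy sketch (not part of the original module; data copied from the series above) reproduces the reported coefficients:

```python
import numpy as np

# graanprijs for ontkoppelde_bedrijfstoeslag = 0 (observations 1-60)
y0 = [102.7, 103.2, 105.6, 103.9, 107.2, 100.7, 92.1, 90.3, 93.4, 98.5,
      100.8, 102.3, 104.7, 101.1, 101.4, 99.5, 98.4, 96.3, 100.7, 101.2,
      100.3, 97.8, 97.4, 98.6, 99.7, 99.0, 98.1, 97.0, 98.5, 103.8,
      114.4, 124.5, 134.2, 131.8, 125.6, 119.9, 114.9, 115.5, 112.5, 111.4,
      115.3, 110.8, 103.7, 111.1, 113.0, 111.2, 117.6, 121.7, 127.3, 129.8,
      137.1, 141.4, 137.4, 130.7, 117.2, 110.8, 111.4, 108.2, 108.8, 110.2]
# graanprijs for ontkoppelde_bedrijfstoeslag = 1 (observations 61-95)
y1 = [109.5, 109.5, 116.0, 111.2, 112.1, 114.0, 119.1, 114.1, 115.1, 115.4,
      110.8, 116.0, 119.2, 126.5, 127.8, 131.3, 140.3, 137.3, 143.0, 134.5,
      139.9, 159.3, 170.4, 175.0, 175.8, 180.9, 180.3, 169.6, 172.3, 184.8,
      177.7, 184.6, 211.4, 215.3, 215.9]

y = np.array(y0 + y1)
d = np.array([0.0] * len(y0) + [1.0] * len(y1))  # dummy regressor
X = np.column_stack([np.ones_like(d), d])        # design matrix [1, d]

beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS estimate
print(round(float(beta[0]), 2), round(float(beta[1]), 2))  # 109.56 37.18

# identical to the group means
assert np.isclose(beta[0], np.mean(y0))
assert np.isclose(beta[1], np.mean(y1) - np.mean(y0))
```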

Source: https://freestatistics.org/blog/index.php?pk=3783&T=1


Multiple Linear Regression - Ordinary Least Squares

Variable                     Parameter  S.D.      T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)                  109.56     2.929268  37.4018                     0               0
ontkoppelde_bedrijfstoeslag  37.18      4.825995  7.7041                      0               0
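For a single 0/1 regressor, the standard errors in this table follow the two-sample formulas: SE(intercept) = s/sqrt(n0) and SE(slope) = s * sqrt(1/n0 + 1/n1), where s is the residual standard deviation and n0 = 60, n1 = 35 are the group sizes. A quick arithmetic check (a sketch, using the Sum Squared Residuals reported further down):

```python
import math

n0, n1 = 60, 35                        # observations with dummy = 0 and dummy = 1
ssr = 47879.808                        # Sum Squared Residuals (from the output below)
s = math.sqrt(ssr / (n0 + n1 - 2))     # residual standard deviation, 93 df

se_intercept = s / math.sqrt(n0)
se_slope = s * math.sqrt(1 / n0 + 1 / n1)

print(round(se_intercept, 6))          # 2.929268
print(round(se_slope, 6))              # 4.825995
print(round(109.56 / se_intercept, 4)) # 37.4018
print(round(37.18 / se_slope, 4))      # 7.7041
```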

Source: https://freestatistics.org/blog/index.php?pk=3783&T=2


Multiple Linear Regression - Regression Statistics
Multiple R: 0.624160879065545
R-squared: 0.389576802955874
Adjusted R-squared: 0.383013112665077
F-TEST (value): 59.3533188947229
F-TEST (DF numerator): 1
F-TEST (DF denominator): 93
p-value: 1.40016886973626e-11

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 22.6900120132469
Sum Squared Residuals: 47879.808
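These statistics are internally consistent and can be re-derived from R-squared alone: with n = 95 observations and k = 2 estimated parameters, adjusted R² = 1 − (1 − R²)(n − 1)/(n − k), and with one regressor the F statistic equals R²/(1 − R²) · (n − k), which is also (up to rounding) the square of the slope's t statistic. A small check:

```python
r2 = 0.389576802955874   # R-squared from the table above
n, k = 95, 2             # observations, estimated parameters

adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k)
f_value = r2 / (1 - r2) * (n - k)   # numerator df = 1, denominator df = 93

print(round(adj_r2, 6))       # 0.383013
print(round(f_value, 4))      # 59.3533
print(round(7.7041 ** 2, 2))  # 59.35 (close to F; t is itself rounded)
```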

Source: https://freestatistics.org/blog/index.php?pk=3783&T=3


Multiple Linear Regression - Actuals, Interpolation, and Residuals

Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	102.7	109.56	-6.86
2	103.2	109.56	-6.36
3	105.6	109.56	-3.96
4	103.9	109.56	-5.66
5	107.2	109.56	-2.36
6	100.7	109.56	-8.86
7	92.1	109.56	-17.46
8	90.3	109.56	-19.26
9	93.4	109.56	-16.16
10	98.5	109.56	-11.06
11	100.8	109.56	-8.76
12	102.3	109.56	-7.26
13	104.7	109.56	-4.86
14	101.1	109.56	-8.46
15	101.4	109.56	-8.16
16	99.5	109.56	-10.06
17	98.4	109.56	-11.16
18	96.3	109.56	-13.26
19	100.7	109.56	-8.86
20	101.2	109.56	-8.36
21	100.3	109.56	-9.26
22	97.8	109.56	-11.76
23	97.4	109.56	-12.16
24	98.6	109.56	-10.96
25	99.7	109.56	-9.86
26	99	109.56	-10.56
27	98.1	109.56	-11.46
28	97	109.56	-12.56
29	98.5	109.56	-11.06
30	103.8	109.56	-5.76
31	114.4	109.56	4.84
32	124.5	109.56	14.94
33	134.2	109.56	24.64
34	131.8	109.56	22.24
35	125.6	109.56	16.04
36	119.9	109.56	10.34
37	114.9	109.56	5.34
38	115.5	109.56	5.94
39	112.5	109.56	2.94
40	111.4	109.56	1.84
41	115.3	109.56	5.74
42	110.8	109.56	1.24
43	103.7	109.56	-5.86
44	111.1	109.56	1.54
45	113	109.56	3.44
46	111.2	109.56	1.64
47	117.6	109.56	8.04
48	121.7	109.56	12.14
49	127.3	109.56	17.74
50	129.8	109.56	20.24
51	137.1	109.56	27.54
52	141.4	109.56	31.84
53	137.4	109.56	27.84
54	130.7	109.56	21.14
55	117.2	109.56	7.64
56	110.8	109.56	1.24
57	111.4	109.56	1.84
58	108.2	109.56	-1.36
59	108.8	109.56	-0.76
60	110.2	109.56	0.64
61	109.5	146.74	-37.24
62	109.5	146.74	-37.24
63	116	146.74	-30.74
64	111.2	146.74	-35.54
65	112.1	146.74	-34.64
66	114	146.74	-32.74
67	119.1	146.74	-27.64
68	114.1	146.74	-32.64
69	115.1	146.74	-31.64
70	115.4	146.74	-31.34
71	110.8	146.74	-35.94
72	116	146.74	-30.74
73	119.2	146.74	-27.54
74	126.5	146.74	-20.24
75	127.8	146.74	-18.94
76	131.3	146.74	-15.44
77	140.3	146.74	-6.44
78	137.3	146.74	-9.44
79	143	146.74	-3.74
80	134.5	146.74	-12.24
81	139.9	146.74	-6.84
82	159.3	146.74	12.56
83	170.4	146.74	23.66
84	175	146.74	28.26
85	175.8	146.74	29.06
86	180.9	146.74	34.16
87	180.3	146.74	33.56
88	169.6	146.74	22.86
89	172.3	146.74	25.56
90	184.8	146.74	38.06
91	177.7	146.74	30.96
92	184.6	146.74	37.86
93	211.4	146.74	64.66
94	215.3	146.74	68.56
95	215.9	146.74	69.16
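Since the fitted value for every observation is simply its group mean (109.56 while the dummy is 0, 146.74 once it switches to 1 at observation 61), each residual is the actual minus the relevant group mean. Two spot checks against the first and last rows:

```python
# fitted values in a one-dummy regression are the two group means
fit0, fit1 = 109.56, 146.74

print(round(102.7 - fit0, 2))  # -6.86  (row 1)
print(round(215.9 - fit1, 2))  # 69.16  (row 95)
```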


Source: https://freestatistics.org/blog/index.php?pk=3783&T=4




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)                # observations in rows, variables in columns
k <- length(x[1,])       # number of variables
n <- length(x[,1])       # number of observations
x1 <- cbind(x[,par1], x[,1:k!=par1])  # move the endogenous variable (par1) to column 1
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # note: 1:n-1 would parse as (1:n)-1 and include i = 0
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
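The preprocessing above (optional seasonal dummies, optional linear trend column) follows a generic pattern: for a period of p, append p − 1 zero/one columns where dummy j marks observations j, j + p, j + 2p, …, then optionally append 1…n as a trend. A compact sketch of the same construction (illustrative only; `build_design` and its parameters are hypothetical names, not part of the module):

```python
import numpy as np

def build_design(y, seasonal=None, trend=False):
    """Mirror the module's preprocessing: y is a 1-D series; optionally
    append monthly/quarterly 0-1 dummies and a linear trend column."""
    n = len(y)
    cols = [np.asarray(y, dtype=float)]
    names = ['y']
    period = {'monthly': 12, 'quarterly': 4}.get(seasonal)
    if period:
        # period - 1 dummies: dummy j is 1 for observations j, j+period, ...
        for j in range(period - 1):
            d = np.zeros(n)
            d[j::period] = 1.0
            cols.append(d)
            names.append(f'S{j + 1}')
    if trend:
        cols.append(np.arange(1, n + 1, dtype=float))  # t = 1..n
        names.append('t')
    return np.column_stack(cols), names

X, names = build_design(range(24), seasonal='quarterly', trend=True)
print(X.shape, names)  # (24, 5) ['y', 'S1', 'S2', 'S3', 't']
```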