Free Statistics


Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 19 Nov 2007 03:08:54 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/19/t1195466551cvwsc409lgy3t3t.htm/, Retrieved Fri, 03 May 2024 09:13:20 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5658, Retrieved Fri, 03 May 2024 09:13:20 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 199
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [WS6 tabel1] [2007-11-19 10:08:54] [6bae8369195607c4cbc8a8485fed7b2f] [Current]
Dataseries X (columns: Tot.prod, Invest, Tijd, as named in the regression output below):
110.40	72.50	0
96.40	59.40	0
101.90	85.70	0
106.20	88.20	0
81.00	62.80	0
94.70	87.00	0
101.00	79.20	0
109.40	112.00	1
102.30	79.20	1
90.70	132.10	1
96.20	40.10	1
96.10	69.00	1
106.00	59.40	1
103.10	73.80	1
102.00	57.40	1
104.70	81.10	1
86.00	46.60	1
92.10	41.40	1
106.90	71.20	1
112.60	67.90	1
101.70	72.00	1
92.00	145.50	1
97.40	39.70	1
97.00	51.90	1
105.40	73.70	1
102.70	70.90	1
98.10	60.80	1
104.50	61.00	1
87.40	54.50	1
89.90	39.10	1
109.80	66.60	1
111.70	58.50	1
98.60	59.80	1
96.90	80.90	1
95.10	37.30	1
97.00	44.60	1
112.70	48.70	1
102.90	54.00	1
97.40	49.50	1
111.40	61.60	1
87.40	35.00	1
96.80	35.70	1
114.10	51.30	1
110.30	49.00	1
103.90	41.50	1
101.60	72.50	1
94.60	42.10	1
95.90	44.10	1
104.70	45.10	1
102.80	50.30	1
98.10	40.90	1
113.90	47.20	1
80.90	36.90	1
95.70	40.90	1
113.20	38.30	1
105.90	46.30	1
108.80	28.40	1
102.30	78.40	1
99.00	36.80	1
100.70	50.70	1
115.50	42.80	1
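
For reference, a minimal R sketch of how this data series could be read in for the analysis below. The column names Tot.prod, Invest and Tijd are taken from the regression output; the stand-alone loading itself is an assumption, not part of the original computation.

# Read the whitespace-separated columns of Dataseries X into a data frame.
# Only the first rows are shown here; the remaining rows would be pasted into text=.
dat <- read.table(text = "
110.40 72.50 0
96.40 59.40 0
101.90 85.70 0
106.20 88.20 0
", col.names = c("Tot.prod", "Invest", "Tijd"))
str(dat)   # three numeric columns; Tijd is a 0/1 dummy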




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5658&T=0


Multiple Linear Regression - Estimated Regression Equation
Invest[t] = 62.826214719616 + 0.137386490692146 Tot.prod[t] - 19.0234796796854 Tijd[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
Invest[t] = 62.826214719616 + 0.137386490692146 Tot.prod[t] - 19.0234796796854 Tijd[t] + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5658&T=1
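
As a cross-check, a short R sketch (assuming the data frame dat from the sketch after the data series, with all 61 observations) that reproduces this equation with lm():

# OLS fit of Invest on Tot.prod and the dummy Tijd.
fit <- lm(Invest ~ Tot.prod + Tijd, data = dat)
coef(fit)
# Per the table above: (Intercept) 62.8262..., Tot.prod 0.1374..., Tijd -19.0235...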


Multiple Linear Regression - Ordinary Least Squares

Variable        Parameter            S.D.        T-STAT (H0: parameter = 0)    2-tail p-value    1-tail p-value
(Intercept)     62.826214719616      35.776982    1.7561                       0.084359          0.042179
Tot.prod        0.137386490692146     0.352245    0.39                         0.697942          0.348971
Tijd            -19.0234796796854     8.857292   -2.1478                       0.035922          0.017961

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 62.826214719616 & 35.776982 & 1.7561 & 0.084359 & 0.042179 \tabularnewline
Tot.prod & 0.137386490692146 & 0.352245 & 0.39 & 0.697942 & 0.348971 \tabularnewline
Tijd & -19.0234796796854 & 8.857292 & -2.1478 & 0.035922 & 0.017961 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5658&T=2
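
The columns of this table are the usual summary() coefficient statistics; a sketch (same hypothetical fit as above) showing how the module derives them, including the 1-tail p-value as half of the 2-tail p-value:

ctab <- summary(fit)$coefficients        # Estimate, Std. Error (S.D.), t value (T-STAT), Pr(>|t|) (2-tail p-value)
ctab
cbind(ctab, "1-tail p-value" = ctab[, 4] / 2)   # the module reports half the 2-tail p-value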


Multiple Linear Regression - Regression Statistics
Multiple R: 0.272384919863927
R-squared: 0.0741935445692779
Adjusted R-squared: 0.0422691840371839
F-TEST (value): 2.32404168267334
F-TEST (DF numerator): 2
F-TEST (DF denominator): 58
p-value: 0.106926643940794

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 21.9495379548793
Sum Squared Residuals: 27943.3685530958

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.272384919863927 \tabularnewline
R-squared & 0.0741935445692779 \tabularnewline
Adjusted R-squared & 0.0422691840371839 \tabularnewline
F-TEST (value) & 2.32404168267334 \tabularnewline
F-TEST (DF numerator) & 2 \tabularnewline
F-TEST (DF denominator) & 58 \tabularnewline
p-value & 0.106926643940794 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 21.9495379548793 \tabularnewline
Sum Squared Residuals & 27943.3685530958 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5658&T=3
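
All of these statistics can be read off summary(); a sketch under the same assumptions as the earlier fit:

s <- summary(fit)
sqrt(s$r.squared)            # Multiple R
s$r.squared                  # R-squared
s$adj.r.squared              # Adjusted R-squared
s$fstatistic                 # F-TEST value, DF numerator, DF denominator
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])   # F-TEST p-value
s$sigma                      # Residual Standard Deviation
sum(residuals(fit)^2)        # Sum Squared Residuals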


Multiple Linear Regression - Actuals, Interpolation, and Residuals

Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1       72.5    77.9936832920288        -5.49368329202883
2       59.4    76.0702724223388        -16.6702724223389
3       85.7    76.8258981211457        8.87410187885435
4       88.2    77.4166600311219        10.7833399688781
5       62.8    73.9545204656798        -11.1545204656798
6       87      75.8367153881622        11.1632846118378
7       79.2    76.7022502795228        2.49774972047724
8       112     58.8328171216514        53.1671828783486
9       79.2    57.8573730377371        21.3426269622629
10      132.1   56.2636897457082        75.8363102542917
11      40.1    57.019315444515         -16.9193154445150
12      69      57.0055767954458        11.9944232045542
13      59.4    58.3657030532981        1.03429694670191
14      73.8    57.9672822302909        15.8327177697091
15      57.4    57.8161570905295        -0.416157090529503
16      81.1    58.1871006153983        22.9128993846017
17      46.6    55.6179732394552        -9.01797323945515
18      41.4    56.4560308326772        -15.0560308326772
19      71.2    58.489350894921         12.7106491050790
20      67.9    59.2724538918663        8.62754610813375
21      72      57.7749411433219        14.2250588566781
22      145.5   56.442292183608         89.057707816392
23      39.7    57.1841792333456        -17.4841792333456
24      51.9    57.1292246370688        -5.22922463706877
25      73.7    58.2832711588828        15.4167288411172
26      70.9    57.912327634014         12.987672365986
27      60.8    57.2803497768301        3.51965022316987
28      61      58.1596233172599        2.84037668274013
29      54.5    55.8103143264242        -1.31031432642416
30      39.1    56.1537805531545        -17.0537805531545
31      66.6    58.8877717179282        7.71222828207175
32      58.5    59.1488060502433        -0.648806050243326
33      59.8    57.3490430221762        2.45095697782380
34      80.9    57.1154859879995        23.7845140120005
35      37.3    56.8681903047537        -19.5681903047537
36      44.6    57.1292246370688        -12.5292246370688
37      48.7    59.2861925409355        -10.5861925409355
38      54      57.9398049321524        -3.93980493215243
39      49.5    57.1841792333456        -7.68417923334563
40      61.6    59.1075901030357        2.49240989696432
41      35      55.8103143264242        -20.8103143264242
42      35.7    57.1017473389303        -21.4017473389303
43      51.3    59.4785336279045        -8.17853362790448
44      49      58.9564649632743        -9.95646496327432
45      41.5    58.0771914228446        -16.5771914228446
46      72.5    57.7612024942526        14.7387975057474
47      42.1    56.7994970594076        -14.6994970594076
48      44.1    56.9780994973074        -12.8780994973074
49      45.1    58.1871006153983        -13.0871006153983
50      50.3    57.9260662830832        -7.62606628308322
51      40.9    57.2803497768301        -16.3803497768301
52      47.2    59.4510563297660        -12.2510563297660
53      36.9    54.9173021369252        -18.0173021369252
54      40.9    56.950622199169         -16.0506221991690
55      38.3    59.3548857862815        -21.0548857862816
56      46.3    58.3519644042289        -12.0519644042289
57      28.4    58.7503852272361        -30.3503852272361
58      78.4    57.8573730377371        20.5426269622629
59      36.8    57.4039976184531        -20.6039976184531
60      50.7    57.6375546526297        -6.93755465262971
61      42.8    59.6708747148735        -16.8708747148735

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 72.5 & 77.9936832920288 & -5.49368329202883 \tabularnewline
2 & 59.4 & 76.0702724223388 & -16.6702724223389 \tabularnewline
3 & 85.7 & 76.8258981211457 & 8.87410187885435 \tabularnewline
4 & 88.2 & 77.4166600311219 & 10.7833399688781 \tabularnewline
5 & 62.8 & 73.9545204656798 & -11.1545204656798 \tabularnewline
6 & 87 & 75.8367153881622 & 11.1632846118378 \tabularnewline
7 & 79.2 & 76.7022502795228 & 2.49774972047724 \tabularnewline
8 & 112 & 58.8328171216514 & 53.1671828783486 \tabularnewline
9 & 79.2 & 57.8573730377371 & 21.3426269622629 \tabularnewline
10 & 132.1 & 56.2636897457082 & 75.8363102542917 \tabularnewline
11 & 40.1 & 57.019315444515 & -16.9193154445150 \tabularnewline
12 & 69 & 57.0055767954458 & 11.9944232045542 \tabularnewline
13 & 59.4 & 58.3657030532981 & 1.03429694670191 \tabularnewline
14 & 73.8 & 57.9672822302909 & 15.8327177697091 \tabularnewline
15 & 57.4 & 57.8161570905295 & -0.416157090529503 \tabularnewline
16 & 81.1 & 58.1871006153983 & 22.9128993846017 \tabularnewline
17 & 46.6 & 55.6179732394552 & -9.01797323945515 \tabularnewline
18 & 41.4 & 56.4560308326772 & -15.0560308326772 \tabularnewline
19 & 71.2 & 58.489350894921 & 12.7106491050790 \tabularnewline
20 & 67.9 & 59.2724538918663 & 8.62754610813375 \tabularnewline
21 & 72 & 57.7749411433219 & 14.2250588566781 \tabularnewline
22 & 145.5 & 56.442292183608 & 89.057707816392 \tabularnewline
23 & 39.7 & 57.1841792333456 & -17.4841792333456 \tabularnewline
24 & 51.9 & 57.1292246370688 & -5.22922463706877 \tabularnewline
25 & 73.7 & 58.2832711588828 & 15.4167288411172 \tabularnewline
26 & 70.9 & 57.912327634014 & 12.987672365986 \tabularnewline
27 & 60.8 & 57.2803497768301 & 3.51965022316987 \tabularnewline
28 & 61 & 58.1596233172599 & 2.84037668274013 \tabularnewline
29 & 54.5 & 55.8103143264242 & -1.31031432642416 \tabularnewline
30 & 39.1 & 56.1537805531545 & -17.0537805531545 \tabularnewline
31 & 66.6 & 58.8877717179282 & 7.71222828207175 \tabularnewline
32 & 58.5 & 59.1488060502433 & -0.648806050243326 \tabularnewline
33 & 59.8 & 57.3490430221762 & 2.45095697782380 \tabularnewline
34 & 80.9 & 57.1154859879995 & 23.7845140120005 \tabularnewline
35 & 37.3 & 56.8681903047537 & -19.5681903047537 \tabularnewline
36 & 44.6 & 57.1292246370688 & -12.5292246370688 \tabularnewline
37 & 48.7 & 59.2861925409355 & -10.5861925409355 \tabularnewline
38 & 54 & 57.9398049321524 & -3.93980493215243 \tabularnewline
39 & 49.5 & 57.1841792333456 & -7.68417923334563 \tabularnewline
40 & 61.6 & 59.1075901030357 & 2.49240989696432 \tabularnewline
41 & 35 & 55.8103143264242 & -20.8103143264242 \tabularnewline
42 & 35.7 & 57.1017473389303 & -21.4017473389303 \tabularnewline
43 & 51.3 & 59.4785336279045 & -8.17853362790448 \tabularnewline
44 & 49 & 58.9564649632743 & -9.95646496327432 \tabularnewline
45 & 41.5 & 58.0771914228446 & -16.5771914228446 \tabularnewline
46 & 72.5 & 57.7612024942526 & 14.7387975057474 \tabularnewline
47 & 42.1 & 56.7994970594076 & -14.6994970594076 \tabularnewline
48 & 44.1 & 56.9780994973074 & -12.8780994973074 \tabularnewline
49 & 45.1 & 58.1871006153983 & -13.0871006153983 \tabularnewline
50 & 50.3 & 57.9260662830832 & -7.62606628308322 \tabularnewline
51 & 40.9 & 57.2803497768301 & -16.3803497768301 \tabularnewline
52 & 47.2 & 59.4510563297660 & -12.2510563297660 \tabularnewline
53 & 36.9 & 54.9173021369252 & -18.0173021369252 \tabularnewline
54 & 40.9 & 56.950622199169 & -16.0506221991690 \tabularnewline
55 & 38.3 & 59.3548857862815 & -21.0548857862816 \tabularnewline
56 & 46.3 & 58.3519644042289 & -12.0519644042289 \tabularnewline
57 & 28.4 & 58.7503852272361 & -30.3503852272361 \tabularnewline
58 & 78.4 & 57.8573730377371 & 20.5426269622629 \tabularnewline
59 & 36.8 & 57.4039976184531 & -20.6039976184531 \tabularnewline
60 & 50.7 & 57.6375546526297 & -6.93755465262971 \tabularnewline
61 & 42.8 & 59.6708747148735 & -16.8708747148735 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5658&T=4
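
The interpolation (fitted) values and residuals in this table are the standard lm() quantities; a sketch, again assuming fit and dat from the earlier sketches:

interp <- fitted(fit)        # Interpolation / Forecast column
res    <- residuals(fit)     # Residuals / Prediction Error column
head(data.frame(Actuals = dat$Invest, Interpolation = interp, Residuals = res))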




Parameters (Session):
par1 = 2 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 2 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1) # column number of the dependent variable
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1]) # move the dependent variable (column par1) to the front
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){ # optionally replace every series by its first difference (1-B)x
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # note: 1:(n-1), not 1:n-1, which would start at 0
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){ # add 11 monthly dummy variables M1..M11
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){ # add 3 quarterly dummy variables Q1..Q3
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){ # append a linear trend variable t
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df)) # OLS regression of the first column on all other columns
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
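
The script above relies on objects that the FreeStatistics.org server defines before sourcing it: the data matrix y, the parameters par1, par2 and par3, and the table.* helpers loaded from the 'createtable' file. A hedged sketch of how those inputs might be supplied when running the model-fitting part stand-alone; the layout of y (one named row per series, so that t(y) has the variables in columns) is an assumption:

par1 <- '2'                                    # dependent variable = 2nd series (Invest)
par2 <- 'Do not include Seasonal Dummies'
par3 <- 'No Linear Trend'
y <- rbind(                                    # illustrative: only the first four observations
Tot.prod = c(110.4, 96.4, 101.9, 106.2),
Invest = c(72.5, 59.4, 85.7, 88.2),
Tijd = c(0, 0, 0, 0))
# The table.start/table.element/... helpers are site-specific and are not reproduced here;
# the table-building part of the script would have to be skipped or stubbed out.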