R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 19 Nov 2007 12:15:19 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/19/t119549932089f7xstp00lwhjv.htm/, Retrieved Fri, 03 May 2024 08:17:19 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5766, Retrieved Fri, 03 May 2024 08:17:19 +0000
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 210
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-    [Multiple Regression] [] [2007-11-19 11:02:03] [707caee747ef4de21385d065f80305f0]
- PD [Multiple Regression] [lineaire trend] [2007-11-19 19:15:19] [1dd4e56f2879fe34e71a5ad240ab3149] [Current]
Dataseries X (three columns: X, Y, D):
106.54	107.89	1
106.44	107.26	1
106.57	107.76	1
106.12	107.32	1
106.13	107.15	1
106.26	108.04	1
105.78	106.52	1
105.77	106.62	0
105.2	106.47	0
105.15	105.46	0
105.01	106.13	0
104.75	105.15	0
104.96	105.39	0
105.26	104.57	0
105.13	104.29	0
104.77	104.09	0
104.79	104.51	0
104.4	103.39	0
103.89	102.71	0
103.93	102.62	0
103.48	101.94	0
103.45	101.65	0
103.47	101.86	0
103.5	101.27	0
103.69	101.21	0
103.57	102.15	0
103.47	102.07	0
102.85	102.8	0
102.54	103.39	0
102.39	102.71	0
102.16	102.65	0
101.51	101.12	0
100.83	100.29	0
100.55	99.79	0
100.88	100.11	0
101	99.76	0
100.51	99.96	0
100.44	99.98	0
100.32	100.49	0
99.98	100.75	0
100.03	100.84	0
99.64	100.44	0
99.11	99.57	0
98.97	99.22	0
98.6	99.08	0
98.31	98.04	0
98.37	98.73	0
98.19	98.72	0
98.51	100.07	0
98.23	99.02	0
97.96	98.94	0
97.77	99	0
97.49	98.54	0
97.76	98.42	0
98.01	97.9	0
97.73	97.46	0
97.06	97	0
96.63	95.97	0
96.58	96.55	0
96.66	96.51	0
96.77	96.76	0
96.5	96.05	0
96.53	96.47	0
96.22	96.38	0
96.49	97.27	0
96.34	96.67	0
96.31	96.59	0
96.06	96.06	0
95.9	96.92	0
95.33	94.96	0
95.53	95.59	0
95.42	95.68	0
95.57	95.35	0
95.3	95.41	0
95.31	95.32	0
95.38	95.8	0
95.22	95.46	0
94.62	94.16	0
93.81	92.49	0
93.6	91.58	0
93.2	91.5	0
93.29	90.83	0
93.54	91.28	0
93.23	90.57	0
93.46	90.93	0
92.82	90.9	0
92.85	91.49	0
92.67	91.38	0
92.32	90.91	0
92.06	90.72	0
91.88	89.53	0
91.53	89.47	0
91.19	89.28	0




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5766&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5766&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5766&T=0








Multiple Linear Regression - Estimated Regression Equation
X[t] = + 103.956799581520 + 0.0277436224976786Y[t] -0.240428121994219D[t] + 0.22261655503733M1[t] + 0.209856411820287M2[t] + 0.312980163669945M3[t] + 0.173878277769374M4[t] + 0.23901190526251M5[t] + 0.219510100701345M6[t] + 0.0881549112736562M7[t] + 0.00085529777875065M8[t] -0.287562219485007M9[t] -0.323491449137164M10[t] -0.0810561418639918M11[t] -0.160791001539852t + e[t]
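
In conventional notation (the coefficient symbols below are introduced here for readability and are not part of the software output), the fitted specification regresses X on Y, the dummy D, eleven monthly seasonal dummies, and a linear trend:

\begin{equation*}
X_t = \beta_0 + \beta_1 Y_t + \beta_2 D_t + \sum_{j=1}^{11} \gamma_j M_{j,t} + \delta\, t + \varepsilon_t , \qquad t = 1,\ldots,93,
\end{equation*}

where $M_{j,t}$ equals 1 every twelfth observation starting at $t=j$ and 0 otherwise, so the remaining (twelfth) month acts as the reference category.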

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
X[t] =  +  103.956799581520 +  0.0277436224976786Y[t] -0.240428121994219D[t] +  0.22261655503733M1[t] +  0.209856411820287M2[t] +  0.312980163669945M3[t] +  0.173878277769374M4[t] +  0.23901190526251M5[t] +  0.219510100701345M6[t] +  0.0881549112736562M7[t] +  0.00085529777875065M8[t] -0.287562219485007M9[t] -0.323491449137164M10[t] -0.0810561418639918M11[t] -0.160791001539852t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5766&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]X[t] =  +  103.956799581520 +  0.0277436224976786Y[t] -0.240428121994219D[t] +  0.22261655503733M1[t] +  0.209856411820287M2[t] +  0.312980163669945M3[t] +  0.173878277769374M4[t] +  0.23901190526251M5[t] +  0.219510100701345M6[t] +  0.0881549112736562M7[t] +  0.00085529777875065M8[t] -0.287562219485007M9[t] -0.323491449137164M10[t] -0.0810561418639918M11[t] -0.160791001539852t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5766&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5766&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable | Parameter | S.D. | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | 103.956799581520 | 5.963318 | 17.4327 | 0 | 0
Y | 0.0277436224976786 | 0.055805 | 0.4972 | 0.620477 | 0.310239
D | -0.240428121994219 | 0.209577 | -1.1472 | 0.254802 | 0.127401
M1 | 0.22261655503733 | 0.238232 | 0.9345 | 0.352955 | 0.176477
M2 | 0.209856411820287 | 0.237681 | 0.8829 | 0.379985 | 0.189992
M3 | 0.312980163669945 | 0.240466 | 1.3016 | 0.196898 | 0.098449
M4 | 0.173878277769374 | 0.243562 | 0.7139 | 0.477421 | 0.23871
M5 | 0.23901190526251 | 0.247236 | 0.9667 | 0.336664 | 0.168332
M6 | 0.219510100701345 | 0.243537 | 0.9013 | 0.370182 | 0.185091
M7 | 0.0881549112736562 | 0.237904 | 0.3705 | 0.711978 | 0.355989
M8 | 0.00085529777875065 | 0.235664 | 0.0036 | 0.997113 | 0.498557
M9 | -0.287562219485007 | 0.235656 | -1.2203 | 0.226042 | 0.113021
M10 | -0.323491449137164 | 0.244816 | -1.3214 | 0.190243 | 0.095122
M11 | -0.0810561418639918 | 0.24347 | -0.3329 | 0.740087 | 0.370044
t | -0.160791001539852 | 0.010155 | -15.8344 | 0 | 0
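
The 1-tail p-value column is simply half of the 2-tail value, and both follow from the t-statistics with 78 residual degrees of freedom (93 observations minus 15 estimated parameters). A minimal sketch in R, assuming mylm and mysum are the objects created by the module's R code listed at the end of this page:

coefs <- mysum$coefficients                      # columns: estimate, standard error, t value, Pr(>|t|)
df.res <- mylm$df.residual                       # 93 - 15 = 78
p.two.tail <- 2 * pt(-abs(coefs[, 3]), df.res)   # reproduces the '2-tail p-value' column
p.one.tail <- p.two.tail / 2                     # the module reports half of the 2-tail value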

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 103.956799581520 & 5.963318 & 17.4327 & 0 & 0 \tabularnewline
Y & 0.0277436224976786 & 0.055805 & 0.4972 & 0.620477 & 0.310239 \tabularnewline
D & -0.240428121994219 & 0.209577 & -1.1472 & 0.254802 & 0.127401 \tabularnewline
M1 & 0.22261655503733 & 0.238232 & 0.9345 & 0.352955 & 0.176477 \tabularnewline
M2 & 0.209856411820287 & 0.237681 & 0.8829 & 0.379985 & 0.189992 \tabularnewline
M3 & 0.312980163669945 & 0.240466 & 1.3016 & 0.196898 & 0.098449 \tabularnewline
M4 & 0.173878277769374 & 0.243562 & 0.7139 & 0.477421 & 0.23871 \tabularnewline
M5 & 0.23901190526251 & 0.247236 & 0.9667 & 0.336664 & 0.168332 \tabularnewline
M6 & 0.219510100701345 & 0.243537 & 0.9013 & 0.370182 & 0.185091 \tabularnewline
M7 & 0.0881549112736562 & 0.237904 & 0.3705 & 0.711978 & 0.355989 \tabularnewline
M8 & 0.00085529777875065 & 0.235664 & 0.0036 & 0.997113 & 0.498557 \tabularnewline
M9 & -0.287562219485007 & 0.235656 & -1.2203 & 0.226042 & 0.113021 \tabularnewline
M10 & -0.323491449137164 & 0.244816 & -1.3214 & 0.190243 & 0.095122 \tabularnewline
M11 & -0.0810561418639918 & 0.24347 & -0.3329 & 0.740087 & 0.370044 \tabularnewline
t & -0.160791001539852 & 0.010155 & -15.8344 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5766&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]103.956799581520[/C][C]5.963318[/C][C]17.4327[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]Y[/C][C]0.0277436224976786[/C][C]0.055805[/C][C]0.4972[/C][C]0.620477[/C][C]0.310239[/C][/ROW]
[ROW][C]D[/C][C]-0.240428121994219[/C][C]0.209577[/C][C]-1.1472[/C][C]0.254802[/C][C]0.127401[/C][/ROW]
[ROW][C]M1[/C][C]0.22261655503733[/C][C]0.238232[/C][C]0.9345[/C][C]0.352955[/C][C]0.176477[/C][/ROW]
[ROW][C]M2[/C][C]0.209856411820287[/C][C]0.237681[/C][C]0.8829[/C][C]0.379985[/C][C]0.189992[/C][/ROW]
[ROW][C]M3[/C][C]0.312980163669945[/C][C]0.240466[/C][C]1.3016[/C][C]0.196898[/C][C]0.098449[/C][/ROW]
[ROW][C]M4[/C][C]0.173878277769374[/C][C]0.243562[/C][C]0.7139[/C][C]0.477421[/C][C]0.23871[/C][/ROW]
[ROW][C]M5[/C][C]0.23901190526251[/C][C]0.247236[/C][C]0.9667[/C][C]0.336664[/C][C]0.168332[/C][/ROW]
[ROW][C]M6[/C][C]0.219510100701345[/C][C]0.243537[/C][C]0.9013[/C][C]0.370182[/C][C]0.185091[/C][/ROW]
[ROW][C]M7[/C][C]0.0881549112736562[/C][C]0.237904[/C][C]0.3705[/C][C]0.711978[/C][C]0.355989[/C][/ROW]
[ROW][C]M8[/C][C]0.00085529777875065[/C][C]0.235664[/C][C]0.0036[/C][C]0.997113[/C][C]0.498557[/C][/ROW]
[ROW][C]M9[/C][C]-0.287562219485007[/C][C]0.235656[/C][C]-1.2203[/C][C]0.226042[/C][C]0.113021[/C][/ROW]
[ROW][C]M10[/C][C]-0.323491449137164[/C][C]0.244816[/C][C]-1.3214[/C][C]0.190243[/C][C]0.095122[/C][/ROW]
[ROW][C]M11[/C][C]-0.0810561418639918[/C][C]0.24347[/C][C]-0.3329[/C][C]0.740087[/C][C]0.370044[/C][/ROW]
[ROW][C]t[/C][C]-0.160791001539852[/C][C]0.010155[/C][C]-15.8344[/C][C]0[/C][C]0[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5766&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5766&T=2








Multiple Linear Regression - Regression Statistics
Multiple R: 0.995616957055275
R-squared: 0.991253125176006
Adjusted R-squared: 0.98968317328452
F-TEST (value): 631.390764616169
F-TEST (DF numerator): 14
F-TEST (DF denominator): 78
p-value: 0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.455035471598093
Sum Squared Residuals: 16.1504678721749
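
Each of these statistics can be read off the summary of the fitted model; a short sketch, again assuming mylm and mysum from the R code at the end of this page:

sqrt(mysum$r.squared)            # Multiple R
mysum$r.squared                  # R-squared
mysum$adj.r.squared              # Adjusted R-squared
f <- mysum$fstatistic            # F value and its numerator/denominator degrees of freedom
1 - pf(f[1], f[2], f[3])         # overall F-test p-value (effectively 0 here)
mysum$sigma                      # residual standard deviation
sum(residuals(mylm)^2)           # sum of squared residuals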

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.995616957055275 \tabularnewline
R-squared & 0.991253125176006 \tabularnewline
Adjusted R-squared & 0.98968317328452 \tabularnewline
F-TEST (value) & 631.390764616169 \tabularnewline
F-TEST (DF numerator) & 14 \tabularnewline
F-TEST (DF denominator) & 78 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.455035471598093 \tabularnewline
Sum Squared Residuals & 16.1504678721749 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5766&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.995616957055275[/C][/ROW]
[ROW][C]R-squared[/C][C]0.991253125176006[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.98968317328452[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]631.390764616169[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]14[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]78[/C][/ROW]
[ROW][C]p-value[/C][C]0[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]0.455035471598093[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]16.1504678721749[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5766&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5766&T=3








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index | Actuals | Interpolation (Forecast) | Residuals (Prediction Error)
1 | 106.54 | 106.771456444297 | -0.231456444297298
2 | 106.44 | 106.580426817367 | -0.140426817366874
3 | 106.57 | 106.536631378926 | 0.03336862107448
4 | 106.12 | 106.224531297586 | -0.104531297586109
5 | 106.13 | 106.124157507715 | 0.00584249228520312
6 | 106.26 | 105.968556525637 | 0.291443474363295
7 | 105.78 | 105.634240028473 | 0.145759971527303
8 | 105.77 | 105.629351897682 | 0.140648102318069
9 | 105.2 | 105.175981835504 | 0.0240181644963368
10 | 105.15 | 104.951240545589 | 0.198759454411002
11 | 105.01 | 105.051473078396 | -0.041473078395764
12 | 104.75 | 104.944549468672 | -0.194549468672183
13 | 104.96 | 105.013033491569 | -0.0530334915691116
14 | 105.26 | 104.816732576364 | 0.443267423635891
15 | 105.13 | 104.751297112375 | 0.378702887625423
16 | 104.77 | 104.445855500435 | 0.324144499565384
17 | 104.79 | 104.361850447837 | 0.428149552163084
18 | 104.4 | 104.150484784539 | 0.249515215461501
19 | 103.89 | 103.839472930273 | 0.050527069727458
20 | 103.93 | 103.588885389213 | 0.341114610787012
21 | 103.48 | 103.120811207111 | 0.359188792889041
22 | 103.45 | 102.916045325395 | 0.533954674605375
23 | 103.47 | 103.003515791852 | 0.466484208147537
24 | 103.5 | 102.907412194903 | 0.59258780509703
25 | 103.69 | 102.967573131051 | 0.722426868949408
26 | 103.57 | 102.820100991442 | 0.749899008558481
27 | 103.47 | 102.760214251952 | 0.709785748048495
28 | 102.85 | 102.480574208934 | 0.369425791065608
29 | 102.54 | 102.401285572161 | 0.138714427838704
30 | 102.39 | 102.202127102762 | 0.187872897238138
31 | 102.16 | 101.908316294444 | 0.251683705555534
32 | 101.51 | 101.617777936988 | -0.107777936988252
33 | 100.83 | 101.145542211512 | -0.315542211511576
34 | 100.55 | 100.934950169071 | -0.384950169070729
35 | 100.88 | 101.025472434003 | -0.145472434003309
36 | 101 | 100.936027306453 | 0.0639726935467434
37 | 100.51 | 101.003401584450 | -0.493401584450265
38 | 100.44 | 100.830405312143 | -0.390405312143332
39 | 100.32 | 100.786887309927 | -0.466887309926958
40 | 99.98 | 100.494207764336 | -0.514207764335922
41 | 100.03 | 100.401047316314 | -0.371047316314000
42 | 99.64 | 100.209657061214 | -0.569657061213912
43 | 99.11 | 99.8933739186734 | -0.783373918673393
44 | 98.97 | 99.6355730357645 | -0.665573035764449
45 | 98.6 | 99.1824804098112 | -0.582480409811169
46 | 98.31 | 98.9569068112216 | -0.646906811221567
47 | 98.37 | 99.0576942164783 | -0.687694216478283
48 | 98.19 | 98.9776819205775 | -0.787681920577453
49 | 98.51 | 99.0769613644468 | -0.56696136444679
50 | 98.23 | 98.8742794160673 | -0.644279416067335
51 | 97.96 | 98.8143926765773 | -0.854392676577336
52 | 97.77 | 98.5161644064868 | -0.746164406486772
53 | 97.49 | 98.4077449660911 | -0.917744966091126
54 | 97.76 | 98.2241229252904 | -0.464122925290377
55 | 98.01 | 97.917550050624 | 0.092449949375956
56 | 97.73 | 97.6572522416903 | 0.0727477583096912
57 | 97.06 | 97.1952816565378 | -0.135281656537769
58 | 96.63 | 96.9699854941732 | -0.339985494173159
59 | 96.58 | 97.0677211009551 | -0.48772110095513
60 | 96.66 | 96.9868764963794 | -0.326876496379365
61 | 96.77 | 97.0556379555013 | -0.285637955501264
62 | 96.5 | 96.862388838771 | -0.362388838771014
63 | 96.53 | 96.8163739105298 | -0.286373910529843
64 | 96.22 | 96.5139840970646 | -0.293984097064631
65 | 96.49 | 96.4430185470409 | 0.0469814529591465
66 | 96.34 | 96.2460795674412 | 0.0939204325587787
67 | 96.31 | 95.951713886674 | 0.358286113326132
68 | 96.06 | 95.6889191517153 | 0.371080848284658
69 | 95.9 | 95.2635701482597 | 0.636429851740269
70 | 95.33 | 95.0124724169723 | 0.31752758302772
71 | 95.53 | 95.1115952048791 | 0.418404795120864
72 | 95.42 | 95.034357271228 | 0.385642728771934
73 | 95.57 | 95.0870274293013 | 0.482972570698681
74 | 95.3 | 94.9151409018943 | 0.384859098105718
75 | 95.31 | 94.8549767261793 | 0.455023273820709
76 | 95.38 | 94.5684007775378 | 0.811599222462239
77 | 95.22 | 94.4633105718418 | 0.756689428158168
78 | 94.62 | 94.2469510564938 | 0.373048943506173
79 | 93.81 | 93.9084730159552 | -0.098473015955166
80 | 93.6 | 93.6351357044475 | -0.0351357044475297
81 | 93.2 | 93.1837076958441 | 0.0162923041559023
82 | 93.29 | 92.9683992375787 | 0.321600762421359
83 | 93.54 | 93.062528173436 | 0.477471826564084
84 | 93.23 | 92.9630953417867 | 0.266904658213293
85 | 93.46 | 93.0349085993834 | 0.42509140061664
86 | 92.82 | 92.8605251459515 | -0.0405251459515358
87 | 92.85 | 92.819226633535 | 0.0307733664650291
88 | 92.67 | 92.5162819476198 | 0.153718052380204
89 | 92.32 | 92.4075850709992 | -0.0875850709991806
90 | 92.06 | 92.2220209766236 | -0.162020976623596
91 | 91.88 | 91.8968598748838 | -0.016859874883824
92 | 91.53 | 91.6471046424992 | -0.117104642499200
93 | 91.19 | 91.192624835421 | -0.0026248354210355
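
The interpolation column is the in-sample fit and the residual column is the corresponding prediction error (actuals minus fit). A brief sketch of how these columns are obtained, assuming mylm, mysum, and x from the R code at the end of this page:

actuals       <- x[, 1]                  # the endogenous series X (first column after reordering)
interpolation <- actuals - mysum$resid   # as computed in the module's table loop
# equivalently: interpolation <- fitted(mylm)
prediction.error <- mysum$resid          # residuals = actuals - interpolation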

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 106.54 & 106.771456444297 & -0.231456444297298 \tabularnewline
2 & 106.44 & 106.580426817367 & -0.140426817366874 \tabularnewline
3 & 106.57 & 106.536631378926 & 0.03336862107448 \tabularnewline
4 & 106.12 & 106.224531297586 & -0.104531297586109 \tabularnewline
5 & 106.13 & 106.124157507715 & 0.00584249228520312 \tabularnewline
6 & 106.26 & 105.968556525637 & 0.291443474363295 \tabularnewline
7 & 105.78 & 105.634240028473 & 0.145759971527303 \tabularnewline
8 & 105.77 & 105.629351897682 & 0.140648102318069 \tabularnewline
9 & 105.2 & 105.175981835504 & 0.0240181644963368 \tabularnewline
10 & 105.15 & 104.951240545589 & 0.198759454411002 \tabularnewline
11 & 105.01 & 105.051473078396 & -0.041473078395764 \tabularnewline
12 & 104.75 & 104.944549468672 & -0.194549468672183 \tabularnewline
13 & 104.96 & 105.013033491569 & -0.0530334915691116 \tabularnewline
14 & 105.26 & 104.816732576364 & 0.443267423635891 \tabularnewline
15 & 105.13 & 104.751297112375 & 0.378702887625423 \tabularnewline
16 & 104.77 & 104.445855500435 & 0.324144499565384 \tabularnewline
17 & 104.79 & 104.361850447837 & 0.428149552163084 \tabularnewline
18 & 104.4 & 104.150484784539 & 0.249515215461501 \tabularnewline
19 & 103.89 & 103.839472930273 & 0.050527069727458 \tabularnewline
20 & 103.93 & 103.588885389213 & 0.341114610787012 \tabularnewline
21 & 103.48 & 103.120811207111 & 0.359188792889041 \tabularnewline
22 & 103.45 & 102.916045325395 & 0.533954674605375 \tabularnewline
23 & 103.47 & 103.003515791852 & 0.466484208147537 \tabularnewline
24 & 103.5 & 102.907412194903 & 0.59258780509703 \tabularnewline
25 & 103.69 & 102.967573131051 & 0.722426868949408 \tabularnewline
26 & 103.57 & 102.820100991442 & 0.749899008558481 \tabularnewline
27 & 103.47 & 102.760214251952 & 0.709785748048495 \tabularnewline
28 & 102.85 & 102.480574208934 & 0.369425791065608 \tabularnewline
29 & 102.54 & 102.401285572161 & 0.138714427838704 \tabularnewline
30 & 102.39 & 102.202127102762 & 0.187872897238138 \tabularnewline
31 & 102.16 & 101.908316294444 & 0.251683705555534 \tabularnewline
32 & 101.51 & 101.617777936988 & -0.107777936988252 \tabularnewline
33 & 100.83 & 101.145542211512 & -0.315542211511576 \tabularnewline
34 & 100.55 & 100.934950169071 & -0.384950169070729 \tabularnewline
35 & 100.88 & 101.025472434003 & -0.145472434003309 \tabularnewline
36 & 101 & 100.936027306453 & 0.0639726935467434 \tabularnewline
37 & 100.51 & 101.003401584450 & -0.493401584450265 \tabularnewline
38 & 100.44 & 100.830405312143 & -0.390405312143332 \tabularnewline
39 & 100.32 & 100.786887309927 & -0.466887309926958 \tabularnewline
40 & 99.98 & 100.494207764336 & -0.514207764335922 \tabularnewline
41 & 100.03 & 100.401047316314 & -0.371047316314000 \tabularnewline
42 & 99.64 & 100.209657061214 & -0.569657061213912 \tabularnewline
43 & 99.11 & 99.8933739186734 & -0.783373918673393 \tabularnewline
44 & 98.97 & 99.6355730357645 & -0.665573035764449 \tabularnewline
45 & 98.6 & 99.1824804098112 & -0.582480409811169 \tabularnewline
46 & 98.31 & 98.9569068112216 & -0.646906811221567 \tabularnewline
47 & 98.37 & 99.0576942164783 & -0.687694216478283 \tabularnewline
48 & 98.19 & 98.9776819205775 & -0.787681920577453 \tabularnewline
49 & 98.51 & 99.0769613644468 & -0.56696136444679 \tabularnewline
50 & 98.23 & 98.8742794160673 & -0.644279416067335 \tabularnewline
51 & 97.96 & 98.8143926765773 & -0.854392676577336 \tabularnewline
52 & 97.77 & 98.5161644064868 & -0.746164406486772 \tabularnewline
53 & 97.49 & 98.4077449660911 & -0.917744966091126 \tabularnewline
54 & 97.76 & 98.2241229252904 & -0.464122925290377 \tabularnewline
55 & 98.01 & 97.917550050624 & 0.092449949375956 \tabularnewline
56 & 97.73 & 97.6572522416903 & 0.0727477583096912 \tabularnewline
57 & 97.06 & 97.1952816565378 & -0.135281656537769 \tabularnewline
58 & 96.63 & 96.9699854941732 & -0.339985494173159 \tabularnewline
59 & 96.58 & 97.0677211009551 & -0.48772110095513 \tabularnewline
60 & 96.66 & 96.9868764963794 & -0.326876496379365 \tabularnewline
61 & 96.77 & 97.0556379555013 & -0.285637955501264 \tabularnewline
62 & 96.5 & 96.862388838771 & -0.362388838771014 \tabularnewline
63 & 96.53 & 96.8163739105298 & -0.286373910529843 \tabularnewline
64 & 96.22 & 96.5139840970646 & -0.293984097064631 \tabularnewline
65 & 96.49 & 96.4430185470409 & 0.0469814529591465 \tabularnewline
66 & 96.34 & 96.2460795674412 & 0.0939204325587787 \tabularnewline
67 & 96.31 & 95.951713886674 & 0.358286113326132 \tabularnewline
68 & 96.06 & 95.6889191517153 & 0.371080848284658 \tabularnewline
69 & 95.9 & 95.2635701482597 & 0.636429851740269 \tabularnewline
70 & 95.33 & 95.0124724169723 & 0.31752758302772 \tabularnewline
71 & 95.53 & 95.1115952048791 & 0.418404795120864 \tabularnewline
72 & 95.42 & 95.034357271228 & 0.385642728771934 \tabularnewline
73 & 95.57 & 95.0870274293013 & 0.482972570698681 \tabularnewline
74 & 95.3 & 94.9151409018943 & 0.384859098105718 \tabularnewline
75 & 95.31 & 94.8549767261793 & 0.455023273820709 \tabularnewline
76 & 95.38 & 94.5684007775378 & 0.811599222462239 \tabularnewline
77 & 95.22 & 94.4633105718418 & 0.756689428158168 \tabularnewline
78 & 94.62 & 94.2469510564938 & 0.373048943506173 \tabularnewline
79 & 93.81 & 93.9084730159552 & -0.098473015955166 \tabularnewline
80 & 93.6 & 93.6351357044475 & -0.0351357044475297 \tabularnewline
81 & 93.2 & 93.1837076958441 & 0.0162923041559023 \tabularnewline
82 & 93.29 & 92.9683992375787 & 0.321600762421359 \tabularnewline
83 & 93.54 & 93.062528173436 & 0.477471826564084 \tabularnewline
84 & 93.23 & 92.9630953417867 & 0.266904658213293 \tabularnewline
85 & 93.46 & 93.0349085993834 & 0.42509140061664 \tabularnewline
86 & 92.82 & 92.8605251459515 & -0.0405251459515358 \tabularnewline
87 & 92.85 & 92.819226633535 & 0.0307733664650291 \tabularnewline
88 & 92.67 & 92.5162819476198 & 0.153718052380204 \tabularnewline
89 & 92.32 & 92.4075850709992 & -0.0875850709991806 \tabularnewline
90 & 92.06 & 92.2220209766236 & -0.162020976623596 \tabularnewline
91 & 91.88 & 91.8968598748838 & -0.016859874883824 \tabularnewline
92 & 91.53 & 91.6471046424992 & -0.117104642499200 \tabularnewline
93 & 91.19 & 91.192624835421 & -0.0026248354210355 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5766&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]106.54[/C][C]106.771456444297[/C][C]-0.231456444297298[/C][/ROW]
[ROW][C]2[/C][C]106.44[/C][C]106.580426817367[/C][C]-0.140426817366874[/C][/ROW]
[ROW][C]3[/C][C]106.57[/C][C]106.536631378926[/C][C]0.03336862107448[/C][/ROW]
[ROW][C]4[/C][C]106.12[/C][C]106.224531297586[/C][C]-0.104531297586109[/C][/ROW]
[ROW][C]5[/C][C]106.13[/C][C]106.124157507715[/C][C]0.00584249228520312[/C][/ROW]
[ROW][C]6[/C][C]106.26[/C][C]105.968556525637[/C][C]0.291443474363295[/C][/ROW]
[ROW][C]7[/C][C]105.78[/C][C]105.634240028473[/C][C]0.145759971527303[/C][/ROW]
[ROW][C]8[/C][C]105.77[/C][C]105.629351897682[/C][C]0.140648102318069[/C][/ROW]
[ROW][C]9[/C][C]105.2[/C][C]105.175981835504[/C][C]0.0240181644963368[/C][/ROW]
[ROW][C]10[/C][C]105.15[/C][C]104.951240545589[/C][C]0.198759454411002[/C][/ROW]
[ROW][C]11[/C][C]105.01[/C][C]105.051473078396[/C][C]-0.041473078395764[/C][/ROW]
[ROW][C]12[/C][C]104.75[/C][C]104.944549468672[/C][C]-0.194549468672183[/C][/ROW]
[ROW][C]13[/C][C]104.96[/C][C]105.013033491569[/C][C]-0.0530334915691116[/C][/ROW]
[ROW][C]14[/C][C]105.26[/C][C]104.816732576364[/C][C]0.443267423635891[/C][/ROW]
[ROW][C]15[/C][C]105.13[/C][C]104.751297112375[/C][C]0.378702887625423[/C][/ROW]
[ROW][C]16[/C][C]104.77[/C][C]104.445855500435[/C][C]0.324144499565384[/C][/ROW]
[ROW][C]17[/C][C]104.79[/C][C]104.361850447837[/C][C]0.428149552163084[/C][/ROW]
[ROW][C]18[/C][C]104.4[/C][C]104.150484784539[/C][C]0.249515215461501[/C][/ROW]
[ROW][C]19[/C][C]103.89[/C][C]103.839472930273[/C][C]0.050527069727458[/C][/ROW]
[ROW][C]20[/C][C]103.93[/C][C]103.588885389213[/C][C]0.341114610787012[/C][/ROW]
[ROW][C]21[/C][C]103.48[/C][C]103.120811207111[/C][C]0.359188792889041[/C][/ROW]
[ROW][C]22[/C][C]103.45[/C][C]102.916045325395[/C][C]0.533954674605375[/C][/ROW]
[ROW][C]23[/C][C]103.47[/C][C]103.003515791852[/C][C]0.466484208147537[/C][/ROW]
[ROW][C]24[/C][C]103.5[/C][C]102.907412194903[/C][C]0.59258780509703[/C][/ROW]
[ROW][C]25[/C][C]103.69[/C][C]102.967573131051[/C][C]0.722426868949408[/C][/ROW]
[ROW][C]26[/C][C]103.57[/C][C]102.820100991442[/C][C]0.749899008558481[/C][/ROW]
[ROW][C]27[/C][C]103.47[/C][C]102.760214251952[/C][C]0.709785748048495[/C][/ROW]
[ROW][C]28[/C][C]102.85[/C][C]102.480574208934[/C][C]0.369425791065608[/C][/ROW]
[ROW][C]29[/C][C]102.54[/C][C]102.401285572161[/C][C]0.138714427838704[/C][/ROW]
[ROW][C]30[/C][C]102.39[/C][C]102.202127102762[/C][C]0.187872897238138[/C][/ROW]
[ROW][C]31[/C][C]102.16[/C][C]101.908316294444[/C][C]0.251683705555534[/C][/ROW]
[ROW][C]32[/C][C]101.51[/C][C]101.617777936988[/C][C]-0.107777936988252[/C][/ROW]
[ROW][C]33[/C][C]100.83[/C][C]101.145542211512[/C][C]-0.315542211511576[/C][/ROW]
[ROW][C]34[/C][C]100.55[/C][C]100.934950169071[/C][C]-0.384950169070729[/C][/ROW]
[ROW][C]35[/C][C]100.88[/C][C]101.025472434003[/C][C]-0.145472434003309[/C][/ROW]
[ROW][C]36[/C][C]101[/C][C]100.936027306453[/C][C]0.0639726935467434[/C][/ROW]
[ROW][C]37[/C][C]100.51[/C][C]101.003401584450[/C][C]-0.493401584450265[/C][/ROW]
[ROW][C]38[/C][C]100.44[/C][C]100.830405312143[/C][C]-0.390405312143332[/C][/ROW]
[ROW][C]39[/C][C]100.32[/C][C]100.786887309927[/C][C]-0.466887309926958[/C][/ROW]
[ROW][C]40[/C][C]99.98[/C][C]100.494207764336[/C][C]-0.514207764335922[/C][/ROW]
[ROW][C]41[/C][C]100.03[/C][C]100.401047316314[/C][C]-0.371047316314000[/C][/ROW]
[ROW][C]42[/C][C]99.64[/C][C]100.209657061214[/C][C]-0.569657061213912[/C][/ROW]
[ROW][C]43[/C][C]99.11[/C][C]99.8933739186734[/C][C]-0.783373918673393[/C][/ROW]
[ROW][C]44[/C][C]98.97[/C][C]99.6355730357645[/C][C]-0.665573035764449[/C][/ROW]
[ROW][C]45[/C][C]98.6[/C][C]99.1824804098112[/C][C]-0.582480409811169[/C][/ROW]
[ROW][C]46[/C][C]98.31[/C][C]98.9569068112216[/C][C]-0.646906811221567[/C][/ROW]
[ROW][C]47[/C][C]98.37[/C][C]99.0576942164783[/C][C]-0.687694216478283[/C][/ROW]
[ROW][C]48[/C][C]98.19[/C][C]98.9776819205775[/C][C]-0.787681920577453[/C][/ROW]
[ROW][C]49[/C][C]98.51[/C][C]99.0769613644468[/C][C]-0.56696136444679[/C][/ROW]
[ROW][C]50[/C][C]98.23[/C][C]98.8742794160673[/C][C]-0.644279416067335[/C][/ROW]
[ROW][C]51[/C][C]97.96[/C][C]98.8143926765773[/C][C]-0.854392676577336[/C][/ROW]
[ROW][C]52[/C][C]97.77[/C][C]98.5161644064868[/C][C]-0.746164406486772[/C][/ROW]
[ROW][C]53[/C][C]97.49[/C][C]98.4077449660911[/C][C]-0.917744966091126[/C][/ROW]
[ROW][C]54[/C][C]97.76[/C][C]98.2241229252904[/C][C]-0.464122925290377[/C][/ROW]
[ROW][C]55[/C][C]98.01[/C][C]97.917550050624[/C][C]0.092449949375956[/C][/ROW]
[ROW][C]56[/C][C]97.73[/C][C]97.6572522416903[/C][C]0.0727477583096912[/C][/ROW]
[ROW][C]57[/C][C]97.06[/C][C]97.1952816565378[/C][C]-0.135281656537769[/C][/ROW]
[ROW][C]58[/C][C]96.63[/C][C]96.9699854941732[/C][C]-0.339985494173159[/C][/ROW]
[ROW][C]59[/C][C]96.58[/C][C]97.0677211009551[/C][C]-0.48772110095513[/C][/ROW]
[ROW][C]60[/C][C]96.66[/C][C]96.9868764963794[/C][C]-0.326876496379365[/C][/ROW]
[ROW][C]61[/C][C]96.77[/C][C]97.0556379555013[/C][C]-0.285637955501264[/C][/ROW]
[ROW][C]62[/C][C]96.5[/C][C]96.862388838771[/C][C]-0.362388838771014[/C][/ROW]
[ROW][C]63[/C][C]96.53[/C][C]96.8163739105298[/C][C]-0.286373910529843[/C][/ROW]
[ROW][C]64[/C][C]96.22[/C][C]96.5139840970646[/C][C]-0.293984097064631[/C][/ROW]
[ROW][C]65[/C][C]96.49[/C][C]96.4430185470409[/C][C]0.0469814529591465[/C][/ROW]
[ROW][C]66[/C][C]96.34[/C][C]96.2460795674412[/C][C]0.0939204325587787[/C][/ROW]
[ROW][C]67[/C][C]96.31[/C][C]95.951713886674[/C][C]0.358286113326132[/C][/ROW]
[ROW][C]68[/C][C]96.06[/C][C]95.6889191517153[/C][C]0.371080848284658[/C][/ROW]
[ROW][C]69[/C][C]95.9[/C][C]95.2635701482597[/C][C]0.636429851740269[/C][/ROW]
[ROW][C]70[/C][C]95.33[/C][C]95.0124724169723[/C][C]0.31752758302772[/C][/ROW]
[ROW][C]71[/C][C]95.53[/C][C]95.1115952048791[/C][C]0.418404795120864[/C][/ROW]
[ROW][C]72[/C][C]95.42[/C][C]95.034357271228[/C][C]0.385642728771934[/C][/ROW]
[ROW][C]73[/C][C]95.57[/C][C]95.0870274293013[/C][C]0.482972570698681[/C][/ROW]
[ROW][C]74[/C][C]95.3[/C][C]94.9151409018943[/C][C]0.384859098105718[/C][/ROW]
[ROW][C]75[/C][C]95.31[/C][C]94.8549767261793[/C][C]0.455023273820709[/C][/ROW]
[ROW][C]76[/C][C]95.38[/C][C]94.5684007775378[/C][C]0.811599222462239[/C][/ROW]
[ROW][C]77[/C][C]95.22[/C][C]94.4633105718418[/C][C]0.756689428158168[/C][/ROW]
[ROW][C]78[/C][C]94.62[/C][C]94.2469510564938[/C][C]0.373048943506173[/C][/ROW]
[ROW][C]79[/C][C]93.81[/C][C]93.9084730159552[/C][C]-0.098473015955166[/C][/ROW]
[ROW][C]80[/C][C]93.6[/C][C]93.6351357044475[/C][C]-0.0351357044475297[/C][/ROW]
[ROW][C]81[/C][C]93.2[/C][C]93.1837076958441[/C][C]0.0162923041559023[/C][/ROW]
[ROW][C]82[/C][C]93.29[/C][C]92.9683992375787[/C][C]0.321600762421359[/C][/ROW]
[ROW][C]83[/C][C]93.54[/C][C]93.062528173436[/C][C]0.477471826564084[/C][/ROW]
[ROW][C]84[/C][C]93.23[/C][C]92.9630953417867[/C][C]0.266904658213293[/C][/ROW]
[ROW][C]85[/C][C]93.46[/C][C]93.0349085993834[/C][C]0.42509140061664[/C][/ROW]
[ROW][C]86[/C][C]92.82[/C][C]92.8605251459515[/C][C]-0.0405251459515358[/C][/ROW]
[ROW][C]87[/C][C]92.85[/C][C]92.819226633535[/C][C]0.0307733664650291[/C][/ROW]
[ROW][C]88[/C][C]92.67[/C][C]92.5162819476198[/C][C]0.153718052380204[/C][/ROW]
[ROW][C]89[/C][C]92.32[/C][C]92.4075850709992[/C][C]-0.0875850709991806[/C][/ROW]
[ROW][C]90[/C][C]92.06[/C][C]92.2220209766236[/C][C]-0.162020976623596[/C][/ROW]
[ROW][C]91[/C][C]91.88[/C][C]91.8968598748838[/C][C]-0.016859874883824[/C][/ROW]
[ROW][C]92[/C][C]91.53[/C][C]91.6471046424992[/C][C]-0.117104642499200[/C][/ROW]
[ROW][C]93[/C][C]91.19[/C][C]91.192624835421[/C][C]-0.0026248354210355[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5766&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5766&T=4




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
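# Inputs normally supplied by the FreeStatistics.org server: the data matrix y
# (one series per row) and the user parameters par1 (column number of the
# endogenous variable), par2 (seasonal dummy option) and par3 (trend/differencing option).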
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {  # note: 1:(n-1), not 1:n-1, which would start at index 0
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
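# With par2 = 'Include Monthly Dummies' (as in this run), x now holds the reordered
# series plus eleven seasonal dummies M1..M11 (Mj = 1 every twelfth observation
# starting at position j); the remaining month serves as the reference category.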
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
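# With par3 = 'Linear Trend' (as in this run), a deterministic trend column t = 1..n
# is appended; its estimated coefficient (about -0.16 per period here) measures the linear trend.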
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
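# lm(df) regresses the first column of df (the endogenous series moved to the front
# above) on all remaining columns; mysum supplies the coefficient table, sigma,
# R-squared and F statistic used to build the tables below.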
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
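
To rerun this analysis outside the FreeStatistics.org server, the script needs the objects the server normally injects. The following is a minimal, hypothetical local setup (the read.table call and object names are illustrative). Note that load(file='createtable') and the table.start/table.element/table.row.*/table.save helpers are server-side utilities, so only the modelling and plotting parts of the script will run locally, and bitmap() requires Ghostscript (png() is a convenient local substitute).

# hypothetical local setup -- paste all 93 rows of the 'Dataseries X' block into 'text'
df.in <- read.table(text = "
106.54 107.89 1
106.44 107.26 1
", col.names = c('X', 'Y', 'D'))
y <- t(as.matrix(df.in))              # the script starts from x <- t(y), so y holds one series per row
par1 <- '1'                           # endogenous variable = first column (X)
par2 <- 'Include Monthly Dummies'
par3 <- 'Linear Trend'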