Free Statistics

Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 22 Nov 2018 10:23:06 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2018/Nov/22/t1542879079ohiwhof6ediiyp0.htm/, Retrieved Fri, 03 May 2024 18:22:54 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=315683, Retrieved Fri, 03 May 2024 18:22:54 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 128
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [] [2018-11-22 09:23:06] [31e4af57f5325aef593e09d831d2befc] [Current]
Dataseries X:
14 4 5 5 4 5 4 4 4 5 5 4 1 5 3 4 5
19 5 5 5 4 5 NA 4 4 3 3 2 5 2 2 5 2
17 5 5 4 4 4 3 3 2 5 5 3 1 3 3 4 2
17 3 4 4 4 4 3 3 3 5 4 2 2 3 3 4 2
15 5 5 5 4 5 4 4 3 5 4 2 1 3 2 4 4
20 5 5 5 4 5 3 4 3 5 5 3 4 4 4 5 4
15 5 4 5 5 5 4 2 3 5 3 3 1 4 3 5 NA
19 4 NA 4 4 5 4 2 4 5 5 2 1 2 2 5 3
15 5 5 4 4 5 2 2 4 5 5 2 1 5 4 5 2
15 5 5 5 5 5 1 2 4 5 5 4 2 4 2 5 4
19 4 3 4 3 4 4 3 2 4 5 2 1 2 2 5 2
NA 3 5 4 3 5 4 3 2 2 4 2 4 4 4 4 4
20 4 5 5 4 5 4 5 4 5 4 3 1 3 5 4 3
18 5 5 5 4 5 5 4 5 4 5 2 5 3 5 5 3
15 4 4 4 4 4 4 3 4 5 5 3 2 4 2 5 4
14 5 4 5 4 5 1 4 4 4 5 2 1 2 2 4 3
20 4 5 5 4 3 4 4 2 5 4 2 NA 1 1 4 2
NA NA NA NA NA 5 4 NA NA 5 5 NA NA NA 5 NA NA
16 5 4 4 4 5 2 NA 2 5 5 3 2 2 2 4 2
16 5 4 5 5 5 3 4 5 4 5 2 1 3 4 5 2
16 5 5 5 4 5 3 NA 4 4 5 2 4 5 4 5 2
10 3 5 5 4 NA 2 3 1 3 4 3 1 4 4 4 3
19 4 5 5 4 3 1 3 5 5 5 1 2 5 4 4 2
19 4 4 4 4 4 3 2 3 4 4 2 3 3 3 4 2
16 5 5 5 5 4 2 2 4 5 5 3 1 5 5 5 3
15 3 4 3 3 4 NA 3 4 4 4 2 4 2 2 4 2
18 5 5 4 5 5 4 3 2 5 5 2 2 4 5 5 3
17 4 4 4 3 4 4 3 4 5 4 3 3 4 2 4 2
19 4 5 4 4 5 2 4 2 5 5 5 1 3 3 5 2
17 4 5 4 4 4 3 4 3 5 5 2 4 2 1 4 2
NA 4 3 5 4 5 4 3 4 5 5 5 1 1 1 4 5
19 5 4 5 3 4 4 4 4 5 5 2 1 2 2 3 3
20 5 5 5 4 4 4 3 4 5 5 2 1 5 1 5 4
5 4 4 5 5 4 3 4 4 5 4 4 1 4 4 4 3
19 5 5 5 4 5 4 3 4 5 4 1 3 3 3 4 3
16 5 5 5 5 5 4 3 4 4 4 2 4 2 3 5 3
15 4 4 4 4 5 4 3 5 4 4 2 2 1 2 4 2
16 5 4 4 4 5 4 3 4 5 5 3 4 3 2 5 4
18 4 4 4 4 2 3 2 4 5 5 2 2 3 3 5 3
16 4 5 4 3 4 3 5 3 5 5 3 2 3 1 5 2
15 4 4 4 4 4 4 3 4 5 5 2 1 5 3 4 3
17 4 4 4 4 4 2 1 4 5 5 3 1 2 2 4 4
NA 4 3 4 3 5 3 2 3 5 5 4 1 2 2 4 3
20 5 5 4 3 5 4 2 2 5 5 4 5 1 2 5 4
19 5 4 5 4 5 4 3 5 5 5 3 1 4 4 4 3
7 4 4 4 4 4 3 2 4 5 5 2 1 4 1 4 4
13 4 4 4 4 4 2 3 3 5 4 2 1 2 2 4 3
16 4 NA 4 1 5 3 5 4 NA NA 1 NA 1 5 2 2
16 4 4 4 4 5 3 4 4 4 5 4 1 5 4 4 3
NA 4 4 4 3 5 4 5 4 5 5 4 1 4 4 4 1
18 5 5 5 4 4 3 2 3 5 5 3 2 4 4 5 2
18 4 4 4 4 4 3 4 4 4 4 2 2 4 2 5 3
16 4 5 4 4 5 3 3 4 5 5 2 2 2 2 5 3
17 5 5 5 4 5 3 3 4 3 4 2 2 2 2 4 2
19 4 5 4 4 5 3 2 4 4 3 2 3 3 2 4 3
16 4 5 4 4 4 5 3 5 3 3 3 1 2 1 4 2
19 4 4 4 3 5 4 2 4 5 4 2 NA 3 5 5 2
13 5 4 3 4 5 NA 4 2 5 5 2 2 4 5 5 2
16 4 4 4 4 4 3 NA 4 5 5 3 1 3 3 4 2
13 5 4 4 3 4 4 3 5 5 4 3 3 2 2 5 2
12 4 5 4 4 5 4 1 2 5 5 2 3 2 2 5 2
17 4 5 5 4 5 1 1 3 5 5 2 1 1 2 4 2
17 4 5 5 4 4 4 3 4 5 5 4 1 3 2 5 3
17 5 5 5 3 4 3 NA 3 5 5 4 2 4 5 5 3
16 5 5 5 4 5 3 2 4 4 4 3 1 4 5 5 4
16 4 4 3 3 3 4 3 4 5 5 4 3 4 3 5 3
14 4 2 4 3 3 2 4 4 4 4 4 3 3 3 3 3
16 4 5 5 4 5 4 3 5 5 5 4 NA 5 4 5 4
13 4 4 4 4 4 5 4 3 2 2 4 4 4 1 4 2
16 4 4 4 3 4 4 4 4 4 3 5 4 1 1 3 1
14 4 5 5 4 5 4 3 4 5 5 3 2 1 1 5 3
20 4 5 5 4 5 4 4 4 5 5 4 1 5 5 5 4
12 2 5 4 5 4 NA 4 4 4 3 4 1 5 4 3 4
13 5 5 5 4 5 4 3 4 5 5 2 1 3 1 4 4
18 4 5 4 4 4 2 3 4 2 3 2 3 2 2 4 2
14 5 5 4 3 4 4 5 4 5 4 3 2 4 3 5 2
19 5 5 5 4 4 2 2 4 3 3 4 1 4 2 5 1
18 4 5 5 5 5 5 4 4 4 5 2 1 4 2 5 2
14 5 5 5 5 4 5 3 3 4 4 5 1 4 5 5 2
18 5 5 5 4 4 2 3 3 5 5 1 1 5 5 5 3
19 4 5 5 4 4 4 3 2 5 5 3 1 4 2 5 2
15 4 4 4 4 4 3 4 2 4 4 3 1 4 4 4 3
14 4 4 4 4 4 3 4 2 4 4 2 3 4 4 4 4
17 4 3 4 4 2 3 NA 3 5 5 2 1 2 1 4 2
19 5 5 5 5 4 4 5 4 4 5 1 4 1 1 5 2
13 4 5 4 3 4 4 3 4 4 4 2 2 1 2 4 1
19 4 4 4 4 5 3 4 4 5 5 1 4 5 4 5 4
18 5 5 5 5 4 3 3 4 5 5 2 1 5 5 5 3
20 5 5 5 5 5 4 5 4 5 5 2 1 3 2 5 4
15 4 5 5 4 4 4 4 4 4 4 2 1 2 2 2 2
15 5 4 2 4 4 2 4 4 4 4 2 2 4 3 4 3
15 4 3 4 3 3 3 4 2 4 4 3 5 2 1 5 5
20 4 4 4 4 4 3 4 3 3 3 2 3 3 4 4 3
15 3 4 3 4 2 3 2 2 4 4 1 4 1 1 4 1
19 4 5 5 4 4 4 3 3 5 5 1 1 5 5 5 3
18 5 5 5 5 5 4 4 4 5 5 3 4 4 4 5 3
18 5 5 5 5 3 4 3 5 4 4 2 4 2 1 4 2
15 4 5 5 4 4 4 3 4 5 5 3 2 2 3 5 1
20 5 5 5 5 5 5 5 5 2 2 1 3 1 1 5 3
17 3 4 4 3 2 4 3 3 5 5 2 1 4 2 5 2
12 5 5 5 5 5 3 1 5 5 5 2 1 2 1 5 2
18 4 5 4 4 5 4 3 4 4 4 3 4 3 1 5 3
19 5 5 5 5 5 4 4 5 3 5 2 4 1 3 4 3
20 3 4 4 3 4 2 2 2 5 5 2 1 2 2 5 3
NA 4 4 4 4 4 3 3 3 4 4 3 3 3 2 4 3
17 5 5 5 5 5 3 4 4 5 5 1 1 1 2 5 2
15 5 5 5 4 5 3 4 5 5 5 4 5 5 5 5 NA
16 4 5 4 5 4 4 4 4 5 5 3 2 4 3 4 1
18 4 5 4 4 4 4 4 5 5 5 2 2 1 2 5 4
18 4 5 4 4 5 4 NA 5 5 5 3 1 4 4 5 3
14 5 4 5 5 5 4 4 5 4 5 3 3 1 3 5 2
15 4 4 4 3 5 3 3 4 5 4 3 1 4 2 3 3
12 5 4 5 4 4 3 3 4 5 5 4 1 2 2 5 3
17 4 3 4 4 5 3 3 4 5 3 3 3 3 4 3 3
14 4 4 4 4 4 2 NA 4 4 4 2 1 3 1 4 2
18 4 4 4 4 5 3 4 4 5 5 3 4 3 4 4 3
17 5 5 5 5 4 2 2 4 5 5 2 1 3 3 5 2
17 5 5 4 4 5 4 5 5 2 1 1 5 3 5 4 3
20 5 5 5 5 5 5 2 5 5 5 1 1 2 4 5 2
16 5 5 5 3 4 3 2 5 5 5 2 1 2 3 5 3
14 4 5 4 4 4 3 2 4 5 4 4 4 4 4 5 4
15 5 4 5 5 4 3 3 4 5 4 3 2 2 3 4 3
18 4 5 5 4 5 2 3 4 5 5 2 1 5 5 4 3
20 5 5 5 4 5 3 4 5 5 5 2 4 1 1 5 2
17 5 4 3 5 4 3 NA 4 5 5 3 1 3 2 4 3
17 5 5 4 4 4 3 4 4 5 5 3 1 3 4 5 2
17 4 5 4 4 5 4 3 4 4 5 3 2 3 4 5 2
17 4 4 4 4 5 4 4 4 3 3 2 2 4 5 3 2
15 5 5 5 4 4 3 4 2 5 4 2 1 3 2 5 2
17 5 5 4 4 4 4 3 4 5 5 2 1 3 3 4 NA
18 4 5 4 4 4 1 3 2 5 5 3 1 2 4 4 3
17 5 5 4 4 4 5 5 4 5 5 4 4 4 5 4 2
20 4 4 4 4 5 4 4 3 4 4 2 4 5 5 3 3
15 5 5 5 5 5 3 3 5 4 5 2 3 4 2 5 2
16 4 3 4 3 4 5 3 2 4 4 1 4 4 4 4 2
15 4 5 4 4 NA 4 3 4 5 4 3 1 4 4 4 2
18 3 3 2 5 4 3 3 3 4 4 3 5 3 5 4 5
11 2 3 4 4 4 NA NA NA NA NA 4 3 4 2 4 3
15 4 5 4 4 3 4 3 3 4 4 3 2 3 4 5 3
18 4 5 5 4 4 4 2 4 5 5 1 3 NA 1 5 1
20 4 4 4 4 5 3 4 5 2 2 1 3 1 2 5 3
19 4 5 NA 4 4 2 4 3 5 5 2 1 2 2 5 2
14 5 5 5 4 4 4 4 2 4 4 1 4 1 1 4 3
16 5 5 4 NA 5 3 5 5 5 5 5 1 4 4 4 3
15 3 5 5 4 3 3 2 4 5 5 3 1 5 3 5 3
17 4 5 4 3 4 4 2 4 4 4 2 3 4 4 5 3
18 4 5 4 4 1 2 3 2 5 4 2 3 3 1 4 2
20 5 5 4 3 5 3 3 5 4 2 4 2 2 4 5 4
17 4 5 4 4 4 4 2 3 5 5 2 4 1 2 5 2
18 5 5 5 5 5 4 4 3 5 5 4 4 3 3 5 1
15 3 4 4 3 3 3 2 3 5 5 4 2 4 3 5 2
16 5 5 5 5 4 4 3 4 4 4 3 4 4 5 5 4
11 5 5 5 4 4 4 NA 4 5 5 4 4 1 5 5 4
15 3 5 5 3 4 3 3 4 5 5 3 2 5 5 5 4
18 5 5 5 4 4 2 3 4 5 4 4 1 3 4 3 3
17 4 5 4 4 5 4 4 4 5 5 3 1 NA 2 4 2
16 5 5 5 4 5 2 2 4 5 5 4 1 4 2 5 4
12 5 5 5 5 5 3 5 5 2 2 2 3 1 1 3 2
19 5 4 5 5 5 4 4 3 5 5 4 3 3 2 4 5
18 5 5 5 4 4 3 3 NA 3 3 1 4 3 4 NA 2
15 4 5 4 3 5 2 5 4 5 5 4 1 4 2 5 3
17 5 4 5 4 5 4 2 4 5 4 3 NA 4 3 2 2
19 5 4 2 5 4 1 4 5 5 5 2 3 5 5 5 3
18 4 5 4 4 3 5 4 3 4 4 2 3 1 1 3 3
19 4 5 5 4 4 4 4 4 5 5 2 NA NA 5 5 4
16 4 4 5 3 4 3 3 2 5 5 4 1 1 1 1 2
16 4 5 4 4 5 4 5 5 5 5 3 2 5 3 5 4
16 4 4 4 3 4 4 3 4 5 4 3 2 3 4 5 2
14 5 5 5 3 4 3 3 3 5 2 2 4 4 3 5 5




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 14 seconds
R Server: Big Analytics Cloud Computing Center

Multiple Linear Regression - Estimated Regression Equation
ITHSUM[t] = 13.2072 - 0.0169155 IK1[t] + 0.720896 IK2[t] - 0.0543074 IK3[t] - 0.463071 IK4[t] + 0.19166 KVDD1[t] - 0.0978236 KVDD2[t] + 0.270208 KVDD3[t] - 0.185305 KVDD4[t] - 0.460079 EP1[t] + 0.575924 EP2[t] - 0.460701 EP3[t] + 0.262505 EP4[t] - 0.203569 EC1[t] + 0.499368 EC2[t] + 0.156823 EC3[t] + 0.0576426 EC4[t] + e[t]
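The coefficients in this equation come from an ordinary least squares fit of ITHSUM on the sixteen regressors. A minimal R sketch of such a fit, assuming the data series above has been loaded into a data frame df with the column names used in the equation (the name df and the listwise deletion of rows containing NA are assumptions, not part of the archived R code):

    # Assumption: df holds the data series above, one column per variable,
    # named as in the estimated equation.
    mylm <- lm(ITHSUM ~ IK1 + IK2 + IK3 + IK4 +
                        KVDD1 + KVDD2 + KVDD3 + KVDD4 +
                        EP1 + EP2 + EP3 + EP4 +
                        EC1 + EC2 + EC3 + EC4,
               data = df, na.action = na.omit)  # rows with any NA are dropped
    summary(mylm)  # parameter estimates, standard errors, t-statistics, p-values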

Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter   S.D.     T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +13.21      2.964    +4.456                       1.952e-05        9.758e-06
IK1           -0.01692    0.4504   -0.03756                     0.9701           0.4851
IK2           +0.7209     0.4297   +1.678                       0.09612          0.04806
IK3           -0.05431    0.4271   -0.1272                      0.899            0.4495
IK4           -0.4631     0.3948   -1.173                       0.2433           0.1216
KVDD1         +0.1917     0.3327   +0.5762                      0.5656           0.2828
KVDD2         -0.09782    0.2461   -0.3975                      0.6917           0.3459
KVDD3         +0.2702     0.2522   +1.071                       0.2862           0.1431
KVDD4         -0.1853     0.2578   -0.7187                      0.4738           0.2369
EP1           -0.4601     0.439    -1.048                       0.2968           0.1484
EP2           +0.5759     0.3906   +1.474                       0.1431           0.07156
EP3           -0.4607     0.2302   -2.001                       0.0477           0.02385
EP4           +0.2625     0.2037   +1.289                       0.2001           0.1
EC1           -0.2036     0.2238   -0.9094                      0.365            0.1825
EC2           +0.4994     0.2023   +2.469                       0.01503          0.007515
EC3           +0.1568     0.3356   +0.4673                      0.6412           0.3206
EC4           +0.05764    0.2595   +0.2221                      0.8246           0.4123

Multiple Linear Regression - Regression Statistics
Multiple R: 0.3997
R-squared: 0.1597
Adjusted R-squared: 0.04283
F-TEST (value): 1.366
F-TEST (DF numerator): 16
F-TEST (DF denominator): 115
p-value: 0.1709

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 2.431
Sum Squared Residuals: 679.6
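The F statistic and its p-value follow directly from R-squared and the two degrees of freedom reported above, F = (R-squared/df1) / ((1 - R-squared)/df2); a quick check in R:

    R2  <- 0.1597; df1 <- 16; df2 <- 115
    F   <- (R2 / df1) / ((1 - R2) / df2)   # approximately 1.366
    pf(F, df1, df2, lower.tail = FALSE)    # approximately 0.17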

Menu of Residual Diagnostics (each can be computed from the residuals of this regression)
- Histogram
- Central Tendency
- QQ Plot
- Kernel Density Plot
- Skewness/Kurtosis Test
- Skewness-Kurtosis Plot
- Harrell-Davis Plot
- Bootstrap Plot -- Central Tendency
- Blocked Bootstrap Plot -- Central Tendency
- (Partial) Autocorrelation Plot
- Spectral Analysis
- Tukey lambda PPCC Plot
- Box-Cox Normality Plot
- Summary Statistics
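Several of these diagnostics can be reproduced directly from the model's residuals with base R; a sketch, assuming the fitted object mylm from the sketch above:

    e <- residuals(mylm)
    hist(e, breaks = 20, main = "Histogram of residuals")    # Histogram
    qqnorm(e); qqline(e)                                      # QQ Plot
    plot(density(e), main = "Kernel density of residuals")    # Kernel Density Plot
    acf(e); pacf(e)                                           # (Partial) Autocorrelation Plot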

Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 14 15.92-1.922
2 17 16.66 0.3396
3 17 15.94 1.065
4 15 16.29-1.286
5 20 18.24 1.762
6 15 17.02-2.019
7 15 15.26-0.2605
8 19 16.38 2.617
9 20 17.37 2.633
10 18 19.5-1.5
11 15 15.32-0.3197
12 14 16.85-2.855
13 16 16.91-0.905
14 19 17.35 1.653
15 19 16.37 2.629
16 16 16.41-0.4061
17 18 18.02-0.024
18 17 15.2 1.803
19 19 16.47 2.528
20 17 17.22-0.2153
21 19 16.22 2.784
22 20 15.46 4.535
23 5 14.66-9.655
24 19 17.26 1.743
25 16 17.42-1.417
26 15 16.01-1.009
27 16 16.22-0.223
28 18 15.87 2.13
29 16 16.92-0.9161
30 15 15.6-0.5992
31 17 14.96 2.037
32 20 17.72 2.284
33 19 15.78 3.223
34 7 14.69-7.689
35 13 15.52-2.516
36 16 16.2-0.1969
37 18 16.87 1.134
38 18 15.97 2.025
39 16 17.14-1.14
40 17 17.2-0.1988
41 19 16.08 2.92
42 16 14.9 1.101
43 13 15.56-2.559
44 12 17.08-5.077
45 17 16.65 0.3469
46 17 15.41 1.591
47 16 17.11-1.108
48 16 15.89 0.1111
49 14 14.63-0.6327
50 13 14.62-1.623
51 16 14.59 1.41
52 14 16.23-2.232
53 20 17.02 2.981
54 13 15.91-2.907
55 18 17.32 0.6772
56 14 16.84-2.835
57 19 14.77 4.233
58 18 16.43 1.57
59 14 15.68-1.677
60 18 18.25-0.2461
61 19 15.98 3.021
62 15 16.46-1.464
63 14 17.51-3.508
64 19 17.95 1.051
65 13 17.13-4.129
66 19 18.12 0.8791
67 18 17.04 0.9608
68 20 16.64 3.36
69 15 16.16-1.16
70 15 16.51-1.507
71 15 16.25-1.246
72 20 17.35 2.648
73 15 16.32-1.318
74 19 18.07 0.9327
75 18 17.43 0.5658
76 18 15.63 2.365
77 15 16.72-1.72
78 20 16.85 3.155
79 17 15.68 1.315
80 12 15.06-3.061
81 18 16.29 1.712
82 19 18.58 0.4157
83 20 16.64 3.357
84 17 17.22-0.2203
85 16 16.02-0.01731
86 18 17.2 0.8031
87 14 16.78-2.779
88 15 14.86 0.1375
89 12 14.97-2.973
90 17 14.83 2.17
91 18 17.39 0.6078
92 17 16.22 0.7825
93 17 18.84-1.843
94 20 17.09 2.906
95 16 17.12-1.122
96 14 16.36-2.355
97 15 15 0.0006962
98 18 17.65 0.3482
99 20 17.33 2.675
100 17 17.22-0.2162
101 17 17.78-0.7793
102 17 17.08-0.07971
103 15 16.42-1.419
104 18 17.63 0.3665
105 17 17.76-0.7565
106 20 17.76 2.24
107 15 16.68-1.678
108 16 17.39-1.392
109 18 16.82 1.181
110 15 17.06-2.063
111 20 17.08 2.917
112 14 17.84-3.838
113 15 15.61-0.6148
114 17 17.78-0.7819
115 18 15.61 2.389
116 20 16.27 3.732
117 17 17.44-0.4368
118 18 16.75 1.252
119 15 15.54-0.5443
120 16 17.41-1.414
121 15 17.86-2.859
122 18 15.7 2.303
123 16 15.36 0.6368
124 12 16.21-4.209
125 19 15.34 3.661
126 15 16.65-1.651
127 19 17.29 1.713
128 18 16.55 1.446
129 16 14.84 1.158
130 16 16.88-0.8831
131 16 16.29-0.2938
132 14 16.53-2.53
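The Interpolation and Residuals columns above are the fitted values and residuals of the model on the 132 complete cases; a sketch, again assuming the fitted object mylm from above:

    interp <- fitted(mylm)       # Interpolation / Forecast column
    res    <- residuals(mylm)    # Residuals / Prediction Error column
    round(cbind(Actuals = interp + res, Interpolation = interp, Residuals = res), 4)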

Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index   greater   2-sided   less
20 0.2829 0.5658 0.7171
21 0.3154 0.6307 0.6846
22 0.5372 0.9255 0.4628
23 0.9728 0.05442 0.02721
24 0.955 0.09003 0.04501
25 0.9349 0.1302 0.06508
26 0.8993 0.2015 0.1007
27 0.8676 0.2648 0.1324
28 0.8207 0.3587 0.1793
29 0.8327 0.3345 0.1673
30 0.7781 0.4437 0.2219
31 0.7265 0.547 0.2735
32 0.7727 0.4546 0.2273
33 0.7772 0.4457 0.2228
34 0.9797 0.04055 0.02028
35 0.9755 0.04904 0.02452
36 0.9816 0.03685 0.01842
37 0.9842 0.0316 0.0158
38 0.9872 0.02567 0.01283
39 0.9823 0.03534 0.01767
40 0.9755 0.04899 0.02449
41 0.9756 0.04873 0.02436
42 0.9655 0.06902 0.03451
43 0.9718 0.05648 0.02824
44 0.9918 0.01633 0.008164
45 0.9881 0.02376 0.01188
46 0.9846 0.03075 0.01538
47 0.9862 0.02769 0.01384
48 0.9815 0.03693 0.01847
49 0.9746 0.05077 0.02539
50 0.9695 0.06108 0.03054
51 0.9621 0.07588 0.03794
52 0.9587 0.0826 0.0413
53 0.9583 0.08347 0.04174
54 0.9612 0.0775 0.03875
55 0.9491 0.1018 0.05088
56 0.9581 0.08387 0.04193
57 0.9794 0.04114 0.02057
58 0.9756 0.04879 0.0244
59 0.9699 0.0601 0.03005
60 0.9618 0.07649 0.03824
61 0.9643 0.07135 0.03568
62 0.9555 0.089 0.0445
63 0.9677 0.06463 0.03232
64 0.9625 0.07501 0.0375
65 0.9814 0.03715 0.01858
66 0.975 0.05008 0.02504
67 0.9674 0.06514 0.03257
68 0.9757 0.04865 0.02433
69 0.9702 0.05953 0.02976
70 0.9674 0.06529 0.03264
71 0.9582 0.08359 0.04179
72 0.9634 0.0731 0.03655
73 0.9578 0.08447 0.04224
74 0.948 0.104 0.05199
75 0.9349 0.1301 0.06506
76 0.9374 0.1252 0.06262
77 0.9265 0.1471 0.07353
78 0.9494 0.1012 0.05062
79 0.9423 0.1154 0.05769
80 0.9698 0.06043 0.03022
81 0.9622 0.07565 0.03783
82 0.9544 0.09121 0.0456
83 0.9619 0.07623 0.03812
84 0.9491 0.1017 0.05086
85 0.9353 0.1295 0.06473
86 0.913 0.1741 0.08703
87 0.9111 0.1777 0.08886
88 0.8959 0.2083 0.1041
89 0.9207 0.1585 0.07926
90 0.9154 0.1693 0.08464
91 0.8872 0.2256 0.1128
92 0.852 0.2959 0.148
93 0.8202 0.3597 0.1798
94 0.8018 0.3964 0.1982
95 0.7654 0.4692 0.2346
96 0.7895 0.421 0.2105
97 0.7968 0.4064 0.2032
98 0.7484 0.5033 0.2516
99 0.7622 0.4757 0.2378
100 0.6948 0.6105 0.3052
101 0.6186 0.7628 0.3814
102 0.5361 0.9279 0.4639
103 0.4694 0.9387 0.5306
104 0.3845 0.7689 0.6155
105 0.2994 0.5988 0.7006
106 0.466 0.932 0.534
107 0.3765 0.7531 0.6235
108 0.3257 0.6513 0.6743
109 0.2634 0.5267 0.7366
110 0.4725 0.9449 0.5275
111 0.7653 0.4694 0.2347
112 0.629 0.742 0.371
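This table can be reproduced by running the Goldfeld-Quandt test once per candidate breakpoint; a sketch using gqtest() from the lmtest package (the breakpoint range 20 to 112 is taken from the table above, while passing the fitted object mylm and treating point as an observation index within the complete cases are assumptions):

    library(lmtest)
    breakpoints <- 20:112
    gq <- t(sapply(breakpoints, function(bp)
      c(greater   = gqtest(mylm, point = bp, alternative = "greater")$p.value,
        two.sided = gqtest(mylm, point = bp, alternative = "two.sided")$p.value,
        less      = gqtest(mylm, point = bp, alternative = "less")$p.value)))
    rownames(gq) <- breakpoints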

Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   Fraction of significant tests   OK/NOK
1% type I error level     0                     0                               OK
5% type I error level     17                    0.182796                        NOK
10% type I error level    45                    0.483871                        NOK
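The meta-analysis counts how many of the 93 two-sided p-values above fall below each type I error level (17/93 = 0.182796, 45/93 = 0.483871); a sketch, assuming the gq matrix from the previous snippet:

    p2 <- gq[, "two.sided"]
    for (alpha in c(0.01, 0.05, 0.10))
      cat(sprintf("%g%% level: %d significant, fraction %.6f\n",
                  100 * alpha, sum(p2 < alpha), mean(p2 < alpha)))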

\begin{tabular}{llll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & \# significant tests & \% significant tests & OK/NOK \tabularnewline
1\% type I error level & 0 &  0 & OK \tabularnewline
5\% type I error level & 17 & 0.182796 & NOK \tabularnewline
10\% type I error level & 45 & 0.483871 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=315683&T=7

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C] 0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]17[/C][C]0.182796[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]45[/C][C]0.483871[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=315683&T=7


Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.6116, df1 = 2, df2 = 113, p-value = 0.2041
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.71919, df1 = 32, df2 = 83, p-value = 0.8515
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.044604, df1 = 2, df2 = 113, p-value = 0.9564

\begin{tabular}{l}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset\_test\_fitted
	RESET test
data:  mylm
RESET = 1.6116, df1 = 2, df2 = 113, p-value = 0.2041
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset\_test\_regressors
	RESET test
data:  mylm
RESET = 0.71919, df1 = 32, df2 = 83, p-value = 0.8515
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset\_test\_principal\_components
	RESET test
data:  mylm
RESET = 0.044604, df1 = 2, df2 = 113, p-value = 0.9564
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=315683&T=8

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.6116, df1 = 2, df2 = 113, p-value = 0.2041
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.71919, df1 = 32, df2 = 83, p-value = 0.8515
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.044604, df1 = 2, df2 = 113, p-value = 0.9564
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=315683&T=8
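The three RESET statistics above are computed with lmtest's resettest() on the fitted model, adding powers 2 and 3 of, respectively, the fitted values, the individual regressors, and the first principal components of the regressors; the exact calls appear in the R code at the end of this page:

library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')    # powers of the fitted values
resettest(mylm, power = 2:3, type = 'regressor') # powers of the regressors
resettest(mylm, power = 2:3, type = 'princomp')  # powers of the principal components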


Variance Inflation Factors (Multicollinearity)
> vif
     IK1      IK2      IK3      IK4    KVDD1    KVDD2    KVDD3    KVDD4 
1.615338 1.507243 1.663850 1.421215 1.405451 1.222929 1.240410 1.221858 
     EP1      EP2      EP3      EP4      EC1      EC2      EC3      EC4 
2.641173 2.471580 1.094537 1.461922 1.802779 1.571429 1.345985 1.260384 

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
     IK1      IK2      IK3      IK4    KVDD1    KVDD2    KVDD3    KVDD4 
1.615338 1.507243 1.663850 1.421215 1.405451 1.222929 1.240410 1.221858 
     EP1      EP2      EP3      EP4      EC1      EC2      EC3      EC4 
2.641173 2.471580 1.094537 1.461922 1.802779 1.571429 1.345985 1.260384 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=315683&T=9

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
     IK1      IK2      IK3      IK4    KVDD1    KVDD2    KVDD3    KVDD4 
1.615338 1.507243 1.663850 1.421215 1.405451 1.222929 1.240410 1.221858 
     EP1      EP2      EP3      EP4      EC1      EC2      EC3      EC4 
2.641173 2.471580 1.094537 1.461922 1.802779 1.571429 1.345985 1.260384 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=315683&T=9
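Each variance inflation factor equals 1/(1 - R²), where R² is obtained by regressing that regressor on all the other regressors; values close to 1 indicate little multicollinearity, and all values reported above stay well below the common warning thresholds of 5 or 10. A minimal sketch of how these numbers can be reproduced, assuming the fitted model mylm and data frame df created in the R code below (the choice of EP1 is only an illustration):

library(car)
vif(mylm)                                              # one VIF per regressor, as reported above
r2 <- summary(lm(EP1 ~ ., data = df[, -1]))$r.squared  # regress EP1 on the other regressors
1 / (1 - r2)                                           # should match vif(mylm)['EP1']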


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par6 = 12 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ; par6 = 12 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par6 <- as.numeric(par6)
if(is.na(par6)) {
par6 <- 12
mywarning = 'Warning: you did not specify the seasonality. The seasonal period was set to s = 12.'
}
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (!is.numeric(par4)) par4 <- 0
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
if (!is.numeric(par5)) par5 <- 0
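# transpose the uploaded series into columns and drop every observation (row) that contains a missing value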
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
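# apply the requested transformation (first differences, seasonal differences, or both) before estimation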
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s)'){
(n <- n - par6)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-Bs)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+par6,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - par6)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-Bs)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+par6,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*par6,par5), dimnames=list(1:(n-par5*par6), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*par6)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*par6-j*par6,par1]
}
}
x <- cbind(x[(par5*par6+1):n,], x2)
n <- n - par5*par6
}
if (par2 == 'Include Seasonal Dummies'){
x2 <- array(0, dim=c(n,par6-1), dimnames=list(1:n, paste('M', seq(1:(par6-1)), sep ='')))
for (i in 1:(par6-1)){
x2[seq(i,n,par6),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
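# if there are enough observations, run the Goldfeld-Quandt test at every admissible breakpoint
# and count how many two-sided p-values fall below the 1%, 5% and 10% significance levels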
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
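# diagnostic plots: actuals with interpolation, residuals, studentized-residual histogram, residual density,
# QQ plot, residual lag plot, (partial) autocorrelation functions, standard lm diagnostics, and the Goldfeld-Quandt p-values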
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
a <-table.start()
a <- table.row.start(a)
a <- table.element(a,'Menu of Residual Diagnostics',2,TRUE)
a <- table.row.end(a)
a <- table.row.start(a)
a <- table.element(a,'Description',1,TRUE)
a <- table.element(a,'Link',1,TRUE)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Histogram',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_histogram.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_centraltendency.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'QQ Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_fitdistrnorm.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Kernel Density Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_density.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Skewness/Kurtosis Test',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_skewness_kurtosis.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Skewness-Kurtosis Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_skewness_kurtosis_plot.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Harrell-Davis Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_harrell_davis.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Bootstrap Plot -- Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_bootstrapplot1.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Blocked Bootstrap Plot -- Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_bootstrapplot.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'(Partial) Autocorrelation Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_autocorrelation.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Spectral Analysis',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_spectrum.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Tukey lambda PPCC Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_tukeylambda.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Box-Cox Normality Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_boxcoxnorm.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <- table.element(a,'Summary Statistics',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_summary1.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable7.tab')
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
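# Ramsey RESET tests for powers 2 and 3 of the fitted values, the regressors, and the principal components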
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
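# variance inflation factors of the regressors (multicollinearity diagnostic)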
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')