Free Statistics


Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 27 Nov 2008 07:22:08 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Nov/27/t1227795952k9nloyr1nrxz95k.htm/, Retrieved Tue, 28 May 2024 05:19:52 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=25817, Retrieved Tue, 28 May 2024 05:19:52 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords
Estimated Impact: 203
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
F       [Multiple Regression] [Opdracht 1 - Blok...] [2008-11-27 14:22:08] [1351baa662f198be3bff32f9007a9a6d] [Current]
Feedback Forum
2008-12-01 15:59:49 [Nathalie Daneels]
Evaluation of Assignment 1 - Block 11 (Q3):

The conclusions are fairly complete and correct, but in places they can still be supplemented:
* Regarding the tables 'Multiple Linear Regression - Estimated Regression Equation' and 'Multiple Linear Regression - Ordinary Least Squares' (a minimal R sketch of this set-up follows after this feedback message):
- The reason we chose December as the reference month: in this assignment too, December is the reference month. The reason is that the data series starts in January 2001, and the reference month is the month (of the following year) that comes just before the first month of the series.
- The null hypothesis (in which the parameters are set equal to 0) states that the attacks of 11 September 2001 had no influence on the consumer confidence indicator, unless the contrary is proven.
- The p-values must each be smaller than 5% (type I error): this means that the probability that we are mistaken is smaller than 5%.
- On the explanation of the dummy variable: D represents the dummy. Because of the attack on the WTC towers, the consumer confidence indicator dropped by 2.21 (rounded). D = 1 means that the attacks have taken place, i.e. that 11 September has occurred; if D = 0 we are before or after that date, and the parameter corresponding to D does not contribute, since it drops out through the value 0.
- When discussing the months relative to the reference month, we have to look at the values in the Parameter column that correspond to those months. The student correctly concluded, among other things, that only in the months October and November is there a decline in the consumer confidence indicator. This discussion does not yet take the attacks of 11 September 2001 or the long-term trend into account; we are examining the seasonality.
- The null hypothesis (which states that the date 11 September 2001 has no influence on the consumer confidence indicator, unless the contrary is proven) may not be rejected, because the parameters are not significantly different from zero. (The student concluded this as well, but 'unless the contrary is proven' still needs to be added, plus the following:) This is also confirmed when we look at the absolute values of the T-statistics: these are all < 2, which means we may not reject H0 and the seasonality was due to chance.
* Regarding the conclusion of the table 'Multiple Linear Regression - Regression Statistics': this concerns the consumer confidence indicator (in the typed sentence: 'R-squared indicates the percentage of the spread/variability of the consumer confidence indicator that we can explain.')
To check whether the fact that this model explains 31% of the fluctuations is due to chance, we must formulate a hypothesis and test whether R-squared differs significantly from zero: H0: R-squared = 0 and Ha: R-squared > 0. We then look at the p-value.
The residual standard deviation = 5.8 (rounded). This indicates the spread of the prediction errors: the error to be expected in the residuals. If I make a prediction with this model, that prediction can be off by 5.8 (a deviation of 5.8 upward or downward).
Looking at the 'Adjusted R-squared', we can conclude that this model does not give a good picture of reality: with it we can explain only 12% of the fluctuations in the consumer confidence indicator. This is subject to chance, because the p-value is larger than the 5% type I error, which means that our alternative hypothesis (>0) does not differ significantly from our H0 (=0).
* Regarding the chart 'Actuals and Interpolation':
This repetition of the pattern indicates that we can make predictions based on the past, which in turn points to autocorrelation. So this is not a good model, because not all assumptions are satisfied.
If we look at these patterns over the long term, we can see that there is an overall declining long-term trend. But in the short term we also repeatedly observe a decline (see the 5 red descending lines). The short-term pattern is clearly repeated within the long-term declining trend.
* Regarding the chart 'Residuals':
Residuals = actual values - predicted values, which the student also stated correctly. If this outcome equals 0, it means we predicted the values correctly (they are then equal to the actual values). If the outcome is greater than 0, the actual values are higher than the predicted values (so the prediction was not correct). If the outcome is smaller than 0, the predicted values are higher than the actual values (again, the prediction was not correct). This chart therefore shows the prediction errors. The mean of these prediction errors must be equal to zero (the student concluded this correctly as well); this means that values predicted too high and values predicted too low neutralize each other, and the mean must also be constant.
It is conceivable that the mean of these prediction errors is nevertheless zero: there are many more prediction errors above the horizontal black line than below it, but the prediction errors below it are much more negative than the positive prediction errors above the horizontal axis. It is possible that these two neutralize each other.
We can conclude, as the student correctly did, that there is no pattern and therefore no autocorrelation in this chart. The reason is the following: rising residuals are not really followed (or preceded) by rising residuals, and falling residuals are not really preceded (or followed) by falling residuals. So we effectively cannot make predictions based on the past, since falling/rising residuals are not preceded by falling/rising residuals.
* Regarding the chart 'Residual Density Plot':
The fact that the number of positive prediction errors does not match the number of negative prediction errors indicates that, in the residuals chart, the mean of the prediction errors is not equal to 0.
* Regarding the chart 'Residual Normal QQ-Plot':
This chart shows the relationship between the sample quantiles and the theoretical quantiles, and from it we can also infer whether (the relationship between) these quantiles of the residuals approaches the normal curve (the diagonal line) or not.
* Regarding the chart 'Residual Lag Plot, Lowess, and Regression Line':
A positive correlation means that there is predictability based on the past. This is an indication that the model may not be correct.
* Regarding the chart 'Residual Autocorrelation Function':
A pattern in the autocorrelation indicates that predictions can be made based on the past. In this case, however, most autocorrelations are due to chance (they fall within the 95% confidence interval). This means that the pattern, which would otherwise point to autocorrelation, is due to chance, and therefore that we cannot make predictions based on the past.

We can therefore conclude:
The model is not yet entirely correct/in order. To satisfy the assumptions:
* There may be no pattern or autocorrelation. This is not satisfied here, because from the residuals chart and the autocorrelation function we can see that there is a pattern and therefore autocorrelation. We should note, however, that the largest part of the autocorrelation is due to chance.
* The mean must be constant and equal to zero. There may be some doubt about this, but in my opinion the mean will not really be zero (see the residuals chart and the density plot).
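
The regression set-up described in this feedback (an 11-September dummy D, monthly dummies M1 through M11 with December as the reference month, and a linear trend t) can be reproduced in a few lines of R. This is a minimal sketch, assuming that the first column of the 'Dataseries X' block below is the consumer confidence indicator and the second column is the dummy D:

# consumer confidence indicator, monthly from January 2001 (first column of Dataseries X)
y <- c(13, 8, 7, 3, 3, 4, 4, 0, -4, -14, -18, -8, -1, 1, 2, 0, 1, 0, -1, -3,
       -3, -3, -4, -8, -9, -13, -18, -11, -9, -10, -13, -11, -5, -15, -6, -6,
       -3, -1, -3, -4, -6, 0, -4, -2, -2, -6, -7, -6, -6, -3, -2, -5, -11,
       -11, -11, -10, -14, -8, -9, -5, -1)
n <- length(y)                            # 61 monthly observations
D <- as.numeric(seq_len(n) == 9)          # dummy: 1 in September 2001, 0 otherwise
M <- factor(((seq_len(n) - 1) %% 12) + 1, # month of each observation (1 = January, ...)
            levels = c(12, 1:11))         # December (12) first, so it is the reference category
t <- seq_len(n)                           # linear trend
summary(lm(y ~ D + M + t))                # should reproduce the parameters in the OLS table below

With December as the omitted level, the intercept plays the role of the December baseline, which is exactly the reference-month interpretation given above.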


Dataseries X:
13	0
8	0
7	0
3	0
3	0
4	0
4	0
0	0
-4	1
-14	0
-18	0
-8	0
-1	0
1	0
2	0
0	0
1	0
0	0
-1	0
-3	0
-3	0
-3	0
-4	0
-8	0
-9	0
-13	0
-18	0
-11	0
-9	0
-10	0
-13	0
-11	0
-5	0
-15	0
-6	0
-6	0
-3	0
-1	0
-3	0
-4	0
-6	0
0	0
-4	0
-2	0
-2	0
-6	0
-7	0
-6	0
-6	0
-3	0
-2	0
-5	0
-11	0
-11	0
-11	0
-10	0
-14	0
-8	0
-9	0
-5	0
-1	0




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 4 seconds
R Server: 'George Udny Yule' @ 72.249.76.132

Source: https://freestatistics.org/blog/index.php?pk=25817&T=0








Multiple Linear Regression - Estimated Regression Equation
Y[t] = -1.54285714285714 -2.21428571428576D[t] + 4.73095238095237M1[t] + 3.5952380952381M2[t] + 2.53571428571428M3[t] + 2.07619047619047M4[t] + 1.21666666666666M5[t] + 2.35714285714285M6[t] + 0.897619047619044M7[t] + 0.838095238095236M8[t] + 1.02142857142856M9[t] -2.88095238095238M10[t] -2.34047619047619M11[t] -0.140476190476191t + e[t]
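
As an illustration of how the equation is read, consider observation 9 (September 2001, so D = 1 and M9 = 1); a quick check in R:

# fitted value for t = 9 (September 2001: D = 1, M9 = 1)
-1.54285714285714 - 2.21428571428576 + 1.02142857142856 - 0.140476190476191*9
# approximately -4, which matches row 9 of the interpolation table further down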

Source: https://freestatistics.org/blog/index.php?pk=25817&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable     Parameter           S.D.      T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  -1.54285714285714   3.037721  -0.5079                     0.613899        0.30695
D            -2.21428571428576   6.625758  -0.3342                     0.73972         0.36986
M1            4.73095238095237   3.524252   1.3424                     0.185915        0.092958
M2            3.5952380952381    3.69981    0.9717                     0.336157        0.168078
M3            2.53571428571428   3.694895   0.6863                     0.495911        0.247956
M4            2.07619047619047   3.690492   0.5626                     0.576396        0.288198
M5            1.21666666666666   3.686602   0.33                       0.742848        0.371424
M6            2.35714285714285   3.683228   0.64                       0.525302        0.262651
M7            0.897619047619044  3.680371   0.2439                     0.808375        0.404187
M8            0.838095238095236  3.678031   0.2279                     0.820739        0.41037
M9            1.02142857142856   3.898934   0.262                      0.794484        0.397242
M10          -2.88095238095238   3.674909  -0.784                      0.437001        0.218501
M11          -2.34047619047619   3.674128  -0.637                      0.527206        0.263603
t            -0.140476190476191  0.043737  -3.2119                     0.002382        0.001191
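
The columns of this table fit together in a simple way; an illustrative check in R, using the D row (the 1-tail p-value is simply half of the 2-tail p-value, as the R code at the bottom of this page also computes):

-2.21428571428576 / 6.625758    # T-STAT = Parameter / S.D.            (approximately -0.3342)
0.73972 / 2                     # 1-tail p-value = 2-tail p-value / 2  (0.36986)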

Source: https://freestatistics.org/blog/index.php?pk=25817&T=2








Multiple Linear Regression - Regression Statistics
Multiple R: 0.558450105749248
R-squared: 0.311866520611346
Adjusted R-squared: 0.121531728440016
F-TEST (value): 1.63851557066151
F-TEST (DF numerator): 13
F-TEST (DF denominator): 47
p-value: 0.10811902855139
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 5.80889511106437
Sum Squared Residuals: 1585.93333333333
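
These statistics are connected as follows; an illustrative check in R, with n = 61 observations and 14 estimated parameters (the intercept, D, M1 through M11, and t), so 47 residual degrees of freedom:

R2 <- 0.311866520611346
1 - (1 - R2) * (61 - 1) / (61 - 14)    # Adjusted R-squared           (approximately 0.1215)
(R2 / 13) / ((1 - R2) / 47)            # F-TEST (value)               (approximately 1.6385)
1 - pf(1.63851557066151, 13, 47)       # p-value of the F-test        (approximately 0.1081)
sqrt(1585.93333333333 / 47)            # Residual Standard Deviation  (approximately 5.8089)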

Source: https://freestatistics.org/blog/index.php?pk=25817&T=3








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index  Actuals  Interpolation (Forecast)  Residuals (Prediction Error)
1    13    3.04761904761907      9.95238095238093
2     8    1.77142857142852      6.22857142857148
3     7    0.571428571428578     6.42857142857142
4     3   -0.0285714285714356    3.02857142857144
5     3   -1.02857142857142      4.02857142857142
6     4   -0.0285714285714267    4.02857142857143
7     4   -1.62857142857143      5.62857142857143
8     0   -1.82857142857143      1.82857142857143
9    -4   -3.99999999999998     -1.90958360235527e-14
10  -14   -5.82857142857143     -8.17142857142857
11  -18   -5.42857142857144    -12.5714285714286
12   -8   -3.22857142857142     -4.77142857142858
13   -1    1.36190476190476     -2.36190476190476
14    1    0.0857142857142938    0.914285714285706
15    2   -1.11428571428572      3.11428571428572
16    0   -1.71428571428571      1.71428571428571
17    1   -2.71428571428571      3.71428571428571
18    0   -1.71428571428571      1.71428571428571
19   -1   -3.31428571428571      2.31428571428571
20   -3   -3.51428571428571      0.514285714285712
21   -3   -3.47142857142858      0.471428571428576
22   -3   -7.51428571428571      4.51428571428571
23   -4   -7.11428571428571      3.11428571428571
24   -8   -4.91428571428571     -3.08571428571429
25   -9   -0.323809523809526    -8.67619047619047
26  -13   -1.59999999999999    -11.4000000000000
27  -18   -2.80000000000001    -15.2
28  -11   -3.4                  -7.6
29   -9   -4.4                  -4.6
30  -10   -3.4                  -6.6
31  -13   -5                    -8
32  -11   -5.2                  -5.8
33   -5   -5.15714285714286      0.157142857142863
34  -15   -9.2                  -5.8
35   -6   -8.8                   2.80000000000000
36   -6   -6.6                   0.599999999999997
37   -3   -2.00952380952381     -0.990476190476187
38   -1   -3.28571428571427      2.28571428571427
39   -3   -4.48571428571429      1.48571428571429
40   -4   -5.08571428571428      1.08571428571428
41   -6   -6.08571428571428      0.0857142857142834
42    0   -5.08571428571429      5.08571428571429
43   -4   -6.68571428571429      2.68571428571429
44   -2   -6.88571428571428      4.88571428571428
45   -2   -6.84285714285715      4.84285714285715
46   -6  -10.8857142857143       4.88571428571429
47   -7  -10.4857142857143       3.48571428571429
48   -6   -8.28571428571428      2.28571428571428
49   -6   -3.6952380952381      -2.3047619047619
50   -3   -4.97142857142856      1.97142857142856
51   -2   -6.17142857142858      4.17142857142858
52   -5   -6.77142857142857      1.77142857142857
53  -11   -7.77142857142857     -3.22857142857143
54  -11   -6.77142857142857     -4.22857142857143
55  -11   -8.37142857142857     -2.62857142857143
56  -10   -8.57142857142857     -1.42857142857143
57  -14   -8.52857142857143     -5.47142857142857
58   -8  -12.5714285714286       4.57142857142857
59   -9  -12.1714285714286       3.17142857142857
60   -5   -9.97142857142857      4.97142857142857
61   -1   -5.38095238095239      4.38095238095239

Source: https://freestatistics.org/blog/index.php?pk=25817&T=4








Goldfeld-Quandt test for Heteroskedasticity (p-values per alternative hypothesis)
breakpoint index   greater              2-sided              less
17                 0.369571338873953    0.739142677747906    0.630428661126047
18                 0.237575420317368    0.475150840634735    0.762424579682632
19                 0.148179183017228    0.296358366034456    0.851820816982772
20                 0.0937814053994598   0.187562810798920    0.90621859460054
21                 0.0500999415280254   0.100199883056051    0.949900058471975
22                 0.471905491172019    0.943810982344038    0.528094508827981
23                 0.782992777273537    0.434014445452925    0.217007222726463
24                 0.697298114608936    0.605403770782128    0.302701885391064
25                 0.765357443964023    0.469285112071953    0.234642556035977
26                 0.856595794143126    0.286808411713747    0.143404205856874
27                 0.975943714137263    0.0481125717254736   0.0240562858627368
28                 0.971398050839191    0.0572038983216174   0.0286019491608087
29                 0.950743345434248    0.0985133091315044   0.0492566545657522
30                 0.934295758883506    0.131408482232989    0.0657042411164944
31                 0.929897086112524    0.140205827774952    0.0701029138874758
32                 0.924579723114556    0.150840553770887    0.0754202768854437
33                 0.888401983297592    0.223196033404815    0.111598016702407
34                 0.949338587286246    0.101322825427507    0.0506614127137536
35                 0.957049117642663    0.085901764714675    0.0429508823573375
36                 0.957208987128104    0.0855820257437926   0.0427910128718963
37                 0.947180873998654    0.105638252002691    0.0528191260013457
38                 0.926560031901286    0.146879936197428    0.0734399680987141
39                 0.911084707169812    0.177830585660375    0.0889152928301875
40                 0.868738776291256    0.262522447417487    0.131261223708744
41                 0.782642488369323    0.434715023261354    0.217357511630677
42                 0.772992241337132    0.454015517325735    0.227007758662868
43                 0.671782721970157    0.656434556059685    0.328217278029843
44                 0.587001541179144    0.825996917641713    0.412998458820856

Source: https://freestatistics.org/blog/index.php?pk=25817&T=5








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests   % significant tests   OK/NOK
1% type I error level    0                     0                     OK
5% type I error level    1                     0.0357142857142857    OK
10% type I error level   5                     0.178571428571429     NOK
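
The meta analysis simply counts how many of the 28 two-sided Goldfeld-Quandt p-values above (breakpoints 17 through 44) fall below each type I error level; an illustrative check:

0 / 28    # 1% level: no two-sided p-value below 0.01           (0, OK)
1 / 28    # 5% level: only breakpoint 27 (p = 0.048)            (0.0357, OK)
5 / 28    # 10% level: breakpoints 27, 28, 29, 35 and 36        (0.1786, NOK)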

Source: https://freestatistics.org/blog/index.php?pk=25817&T=6




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
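# optional transformation to first differences (not applied in this run: par3 = 'Linear Trend')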
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
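# add 11 monthly dummy variables M1..M11; month 12 (December) is the omitted reference category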
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
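# append a linear trend variable t = 1, 2, ..., n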
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
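# fit the multiple linear regression: the first column of x is the response, all remaining columns are regressors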
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
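# Goldfeld-Quandt test for heteroskedasticity at every admissible breakpoint (computed only when n > 25)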
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
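# diagnostic plots: actuals and interpolation, residuals, histogram, density plot, normal Q-Q plot, lag plot with lowess and regression line, ACF, PACF, and lm diagnostics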
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
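# build the output tables shown above; table.start(), table.element(), table.save() etc. are helpers loaded from the 'createtable' file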
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}