
Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Date of computation: Fri, 02 Dec 2016 11:54:22 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/02/t1480677302gx9djr97x2utrsm.htm/, Retrieved Fri, 01 Nov 2024 03:47:56 +0000
Alternative citation (permanent link): Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=297564, Retrieved Fri, 01 Nov 2024 03:47:56 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 123
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-   [Kendall tau Correlation Matrix] [Main hypothesis co...] [2016-12-02 10:54:22] [f9ec87b450e3a9e81b1e03387661d62c] [Current]
Dataseries X (the five columns correspond, in order, to the series ITH1, ITH2, ITH3, ITH4 and TVDCsum referenced in the tables below):
3	4	3	4	13
5	5	5	4	16
5	4	4	4	17
5	4	4	4	15
4	4	3	4	16
5	5	5	5	16
5	5	5	4	16
5	5	4	1	17
5	4	3	3	17
5	5	5	4	17
4	4	5	3	15
5	5	5	5	16
5	5	4	4	14
4	4	3	4	16
3	4	4	3	17
5	5	5	5	16
5	3	3	5	17
4	4	4	4	16
2	5	1	2	15
5	5	4	5	16
5	5	4	5	15
5	5	4	2	17
4	4	4	3	14
4	5	5	4	16
4	5	4	4	15
5	5	4	5	16
5	5	4	3	16
4	4	4	2	13
5	5	4	5	15
5	5	5	5	17
1	1	1	2	15
5	5	4	5	13
4	5	4	3	17
4	4	4	3	15
4	4	4	4	14
5	5	4	4	14
4	4	5	3	18
4	4	4	3	15
5	4	4	4	17
3	3	4	4	13
5	5	5	5	16
5	5	5	4	15
2	2	1	2	15
3	3	3	4	16
4	4	3	5	15
4	5	3	4	13
5	5	4	4	17
5	5	5	3	18
4	4	4	4	18
5	5	3	4	11
5	5	5	4	14
4	4	4	4	13
5	5	4	5	15
4	5	3	1	17
4	4	4	4	16
3	4	3	3	15
4	4	3	1	17
4	5	4	4	16
5	4	4	4	16
4	5	4	4	16
4	5	4	3	15
4	4	4	4	12
4	3	3	4	17
4	4	4	4	14
2	4	4	3	14
4	5	4	3	16
4	4	3	3	15
5	5	5	5	15
3	3	3	3	14
3	4	3	3	13
5	4	5	4	18
4	3	3	4	15
5	5	5	4	16
4	5	4	5	14
4	3	3	4	15
5	5	3	5	17
5	5	5	4	16
5	4	3	3	10
4	4	3	3	16
5	4	4	4	17
5	5	5	4	17
2	5	4	2	20
5	4	5	5	17
5	5	4	4	18
5	5	5	5	15
5	4	4	2	17
4	4	4	3	14
4	4	4	3	15
5	5	5	5	17
4	4	4	3	16
5	5	5	4	17
5	5	4	4	15
5	4	5	4	16
4	4	4	3	18
5	5	5	5	18
5	5	5	2	16
5	4	5	4	17
5	5	5	4	15
5	5	5	5	13
4	3	3	3	15
4	4	5	4	17
4	4	4	3	16
4	4	4	4	16
5	5	5	3	15
5	5	4	4	16
4	4	2	4	16
3	4	4	4	14
3	4	3	2	15
4	4	5	4	12
5	5	4	4	16
5	4	4	4	16
4	4	5	4	17
5	5	5	5	16
5	4	4	3	14
4	4	3	3	15
4	4	3	4	14
5	5	4	4	16
5	5	5	5	15
5	5	3	4	17
5	5	3	4	15
4	5	4	4	16
5	4	4	4	16
3	4	4	4	15
5	5	4	3	15
5	4	5	4	13
5	5	5	5	18
4	4	4	3	13
4	4	4	4	11
4	4	5	5	18
4	4	4	3	15
5	4	5	4	19
5	5	5	5	17
5	5	5	4	13
4	4	4	2	14
5	4	4	2	13
5	4	4	4	17
5	4	5	4	14
5	5	5	5	19
5	3	5	4	14
5	4	5	4	16
4	4	4	3	12
5	4	4	3	16
3	3	3	2	16
3	4	4	4	15
4	5	4	5	12
4	5	4	4	15
3	5	3	5	17
3	4	3	2	14
5	5	5	4	15
5	5	4	4	18
5	4	4	2	15
5	4	4	4	18
5	5	5	4	15
5	4	5	4	15
5	5	5	4	16
5	4	5	2	13
4	4	4	4	16
4	4	5	3	14
2	4	5	3	16




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 1 seconds
R Server: Big Analytics Cloud Computing Center

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code) \tabularnewline
Raw Output & view raw output of R engine \tabularnewline
Computing time & 1 seconds \tabularnewline
R Server & Big Analytics Cloud Computing Center \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297564&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code)[/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine[/C][/ROW]
[ROW][C]Computing time[/C][C]1 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]Big Analytics Cloud Computing Center[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297564&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297564&T=0









Correlations for all pairs of data series (method=kendall)
          ITH1    ITH2    ITH3    ITH4    TVDCsum
ITH1      1       0.45    0.46    0.359   0.198
ITH2      0.45    1       0.336   0.329   0.141
ITH3      0.46    0.336   1       0.294   0.138
ITH4      0.359   0.329   0.294   1       0.114
TVDCsum   0.198   0.141   0.138   0.114   1

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series (method=kendall) \tabularnewline
  & ITH1 & ITH2 & ITH3 & ITH4 & TVDCsum \tabularnewline
ITH1 & 1 & 0.45 & 0.46 & 0.359 & 0.198 \tabularnewline
ITH2 & 0.45 & 1 & 0.336 & 0.329 & 0.141 \tabularnewline
ITH3 & 0.46 & 0.336 & 1 & 0.294 & 0.138 \tabularnewline
ITH4 & 0.359 & 0.329 & 0.294 & 1 & 0.114 \tabularnewline
TVDCsum & 0.198 & 0.141 & 0.138 & 0.114 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297564&T=1

[TABLE]
[ROW][C]Correlations for all pairs of data series (method=kendall)[/C][/ROW]
[ROW][C] [/C][C]ITH1[/C][C]ITH2[/C][C]ITH3[/C][C]ITH4[/C][C]TVDCsum[/C][/ROW]
[ROW][C]ITH1[/C][C]1[/C][C]0.45[/C][C]0.46[/C][C]0.359[/C][C]0.198[/C][/ROW]
[ROW][C]ITH2[/C][C]0.45[/C][C]1[/C][C]0.336[/C][C]0.329[/C][C]0.141[/C][/ROW]
[ROW][C]ITH3[/C][C]0.46[/C][C]0.336[/C][C]1[/C][C]0.294[/C][C]0.138[/C][/ROW]
[ROW][C]ITH4[/C][C]0.359[/C][C]0.329[/C][C]0.294[/C][C]1[/C][C]0.114[/C][/ROW]
[ROW][C]TVDCsum[/C][C]0.198[/C][C]0.141[/C][C]0.138[/C][C]0.114[/C][C]1[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297564&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297564&T=1
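
For readers who want to reproduce the Kendall tau matrix above outside the FreeStatistics.org module, the following minimal R sketch should give the same coefficients up to rounding. It assumes the five columns of Dataseries X have been saved to a tab-separated text file and are analysed under the names ITH1, ITH2, ITH3, ITH4 and TVDCsum used in the tables; the file name 'dataseries_x.txt' is only illustrative.

# Minimal sketch (not the module's own code): pairwise Kendall tau correlations
# for the five series. 'dataseries_x.txt' is a hypothetical file holding the
# tab-separated values listed under 'Dataseries X' above.
X <- read.table('dataseries_x.txt', header = FALSE,
                col.names = c('ITH1', 'ITH2', 'ITH3', 'ITH4', 'TVDCsum'))
round(cor(X, method = 'kendall'), 3)  # correlation matrix rounded to 3 digits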









Correlations for all pairs of data series with p-values
pair           Pearson r   Spearman rho   Kendall tau
ITH1;ITH2      0.5096      0.4756         0.4501
p-value        (0)         (0)            (0)
ITH1;ITH3      0.5505      0.4975         0.4599
p-value        (0)         (0)            (0)
ITH1;ITH4      0.3947      0.3959         0.3592
p-value        (0)         (0)            (0)
ITH1;TVDCsum   0.1481      0.2352         0.1979
p-value        (0.0625)    (0.0028)       (0.0032)
ITH2;ITH3      0.4504      0.3578         0.3363
p-value        (0)         (0)            (0)
ITH2;ITH4      0.3241      0.3641         0.3293
p-value        (0)         (0)            (0)
ITH2;TVDCsum   0.1494      0.1676         0.1409
p-value        (0.0601)    (0.0347)       (0.0388)
ITH3;ITH4      0.3621      0.3311         0.2944
p-value        (0)         (0)            (0)
ITH3;TVDCsum   0.1619      0.1673         0.1384
p-value        (0.0415)    (0.035)        (0.0368)
ITH4;TVDCsum   0.077       0.1359         0.1136
p-value        (0.3346)    (0.0876)       (0.0814)

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series with p-values \tabularnewline
pair & Pearson r & Spearman rho & Kendall tau \tabularnewline
ITH1;ITH2 & 0.5096 & 0.4756 & 0.4501 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
ITH1;ITH3 & 0.5505 & 0.4975 & 0.4599 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
ITH1;ITH4 & 0.3947 & 0.3959 & 0.3592 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
ITH1;TVDCsum & 0.1481 & 0.2352 & 0.1979 \tabularnewline
p-value & (0.0625) & (0.0028) & (0.0032) \tabularnewline
ITH2;ITH3 & 0.4504 & 0.3578 & 0.3363 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
ITH2;ITH4 & 0.3241 & 0.3641 & 0.3293 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
ITH2;TVDCsum & 0.1494 & 0.1676 & 0.1409 \tabularnewline
p-value & (0.0601) & (0.0347) & (0.0388) \tabularnewline
ITH3;ITH4 & 0.3621 & 0.3311 & 0.2944 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
ITH3;TVDCsum & 0.1619 & 0.1673 & 0.1384 \tabularnewline
p-value & (0.0415) & (0.035) & (0.0368) \tabularnewline
ITH4;TVDCsum & 0.077 & 0.1359 & 0.1136 \tabularnewline
p-value & (0.3346) & (0.0876) & (0.0814) \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297564&T=2

[TABLE]
[ROW][C]Correlations for all pairs of data series with p-values[/C][/ROW]
[ROW][C]pair[/C][C]Pearson r[/C][C]Spearman rho[/C][C]Kendall tau[/C][/ROW]
[ROW][C]ITH1;ITH2[/C][C]0.5096[/C][C]0.4756[/C][C]0.4501[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]ITH1;ITH3[/C][C]0.5505[/C][C]0.4975[/C][C]0.4599[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]ITH1;ITH4[/C][C]0.3947[/C][C]0.3959[/C][C]0.3592[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]ITH1;TVDCsum[/C][C]0.1481[/C][C]0.2352[/C][C]0.1979[/C][/ROW]
[ROW][C]p-value[/C][C](0.0625)[/C][C](0.0028)[/C][C](0.0032)[/C][/ROW]
[ROW][C]ITH2;ITH3[/C][C]0.4504[/C][C]0.3578[/C][C]0.3363[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]ITH2;ITH4[/C][C]0.3241[/C][C]0.3641[/C][C]0.3293[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]ITH2;TVDCsum[/C][C]0.1494[/C][C]0.1676[/C][C]0.1409[/C][/ROW]
[ROW][C]p-value[/C][C](0.0601)[/C][C](0.0347)[/C][C](0.0388)[/C][/ROW]
[ROW][C]ITH3;ITH4[/C][C]0.3621[/C][C]0.3311[/C][C]0.2944[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]ITH3;TVDCsum[/C][C]0.1619[/C][C]0.1673[/C][C]0.1384[/C][/ROW]
[ROW][C]p-value[/C][C](0.0415)[/C][C](0.035)[/C][C](0.0368)[/C][/ROW]
[ROW][C]ITH4;TVDCsum[/C][C]0.077[/C][C]0.1359[/C][C]0.1136[/C][/ROW]
[ROW][C]p-value[/C][C](0.3346)[/C][C](0.0876)[/C][C](0.0814)[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297564&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297564&T=2
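
Each row of the table above pairs a correlation estimate with its two-sided p-value. As an illustrative sketch (not the module's code) of how one such row can be computed in base R, using the hypothetical data frame X from the earlier sketch, consider the pair ITH1;TVDCsum:

# Illustrative sketch: estimate and p-value for one pair with each method.
# With tied ranks, R warns that exact p-values for the Spearman and Kendall
# tests cannot be computed and falls back on approximations.
for (m in c('pearson', 'spearman', 'kendall')) {
  ct <- cor.test(X$ITH1, X$TVDCsum, method = m)
  cat(m, ': estimate =', round(ct$estimate, 4),
      ', p-value =', round(ct$p.value, 4), '\n')
}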









Meta Analysis of Correlation Tests
Number of significant by total number of Correlations
Type I error   Pearson r   Spearman rho   Kendall tau
0.01           0.6         0.7            0.7
0.02           0.6         0.7            0.7
0.03           0.6         0.7            0.7
0.04           0.6         0.9            0.9
0.05           0.7         0.9            0.9
0.06           0.7         0.9            0.9
0.07           0.9         0.9            0.9
0.08           0.9         0.9            0.9
0.09           0.9         1              1
0.1            0.9         1              1

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Correlation Tests \tabularnewline
Number of significant by total number of Correlations \tabularnewline
Type I error & Pearson r & Spearman rho & Kendall tau \tabularnewline
0.01 & 0.6 & 0.7 & 0.7 \tabularnewline
0.02 & 0.6 & 0.7 & 0.7 \tabularnewline
0.03 & 0.6 & 0.7 & 0.7 \tabularnewline
0.04 & 0.6 & 0.9 & 0.9 \tabularnewline
0.05 & 0.7 & 0.9 & 0.9 \tabularnewline
0.06 & 0.7 & 0.9 & 0.9 \tabularnewline
0.07 & 0.9 & 0.9 & 0.9 \tabularnewline
0.08 & 0.9 & 0.9 & 0.9 \tabularnewline
0.09 & 0.9 & 1 & 1 \tabularnewline
0.1 & 0.9 & 1 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297564&T=3

[TABLE]
[ROW][C]Meta Analysis of Correlation Tests[/C][/ROW]
[ROW][C]Number of significant by total number of Correlations[/C][/ROW]
[ROW][C]Type I error[/C][C]Pearson r[/C][C]Spearman rho[/C][C]Kendall tau[/C][/ROW]
[ROW][C]0.01[/C][C]0.6[/C][C]0.7[/C][C]0.7[/C][/ROW]
[ROW][C]0.02[/C][C]0.6[/C][C]0.7[/C][C]0.7[/C][/ROW]
[ROW][C]0.03[/C][C]0.6[/C][C]0.7[/C][C]0.7[/C][/ROW]
[ROW][C]0.04[/C][C]0.6[/C][C]0.9[/C][C]0.9[/C][/ROW]
[ROW][C]0.05[/C][C]0.7[/C][C]0.9[/C][C]0.9[/C][/ROW]
[ROW][C]0.06[/C][C]0.7[/C][C]0.9[/C][C]0.9[/C][/ROW]
[ROW][C]0.07[/C][C]0.9[/C][C]0.9[/C][C]0.9[/C][/ROW]
[ROW][C]0.08[/C][C]0.9[/C][C]0.9[/C][C]0.9[/C][/ROW]
[ROW][C]0.09[/C][C]0.9[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.1[/C][C]0.9[/C][C]1[/C][C]1[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297564&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297564&T=3
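
The meta-analysis table reports, for each Type I error level, the fraction of the ten pairwise correlation tests whose p-value falls below that level (the module counts a test as significant when p < alpha). A minimal sketch that mirrors this calculation, again using the hypothetical data frame X assumed in the earlier sketches:

# Illustrative sketch, not the module's code: fraction of significant pairwise
# tests at alpha = 0.01, ..., 0.10 for each correlation method.
alphas <- seq(0.01, 0.10, by = 0.01)
pvals <- sapply(c('pearson', 'spearman', 'kendall'), function(m)
  apply(combn(ncol(X), 2), 2, function(ij)
    cor.test(X[[ij[1]]], X[[ij[2]]], method = m)$p.value))
tab <- t(sapply(alphas, function(a) round(colMeans(pvals < a), 2)))
rownames(tab) <- alphas
tab  # rows: Type I error levels; columns: Pearson, Spearman, Kendall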





Parameters (Session):
Parameters (R input):
par1 = kendall ;
R code (references can be found in the software module):
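# Note on context: this listing is the module's server-side R code as archived.
# The objects par1 (the chosen correlation method, here 'kendall'), x and y
# (the uploaded data, with one series per row of y), main (the chart title) and
# the table.start/table.row.start/table.element/table.row.end/table.end/
# table.save helpers are supplied by the FreeStatistics.org computation
# framework (the table helpers presumably via load(file='createtable')) and are
# not defined in this listing.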
# Lower panel of the scatterplot matrix: print the rounded p-value of the
# correlation test (method given by par1) for each pair of series.
panel.tau <- function(x, y, digits=2, prefix='', cex.cor)
{
  usr <- par('usr'); on.exit(par(usr))
  par(usr = c(0, 1, 0, 1))
  rr <- cor.test(x, y, method=par1)
  r <- round(rr$p.value,2)
  txt <- format(c(r, 0.123456789), digits=digits)[1]
  txt <- paste(prefix, txt, sep='')
  if(missing(cex.cor)) cex <- 0.5/strwidth(txt)
  text(0.5, 0.5, txt, cex = cex)
}
# Diagonal panel: draw a rescaled histogram of each series.
panel.hist <- function(x, ...)
{
  usr <- par('usr'); on.exit(par(usr))
  par(usr = c(usr[1:2], 0, 1.5) )
  h <- hist(x, plot = FALSE)
  breaks <- h$breaks; nB <- length(breaks)
  y <- h$counts; y <- y/max(y)
  rect(breaks[-nB], 0, breaks[-1], y, col='grey', ...)
}
x <- na.omit(x)
y <- t(na.omit(t(y)))
bitmap(file='test1.png')
# Scatterplot matrix: histograms on the diagonal, lowess smoothers in the
# upper panels, correlation-test p-values in the lower panels.
pairs(t(y),diag.panel=panel.hist, upper.panel=panel.smooth, lower.panel=panel.tau, main=main)
dev.off()
load(file='createtable')
n <- length(y[,1])  # number of data series
print(n)
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
  a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
for (i in 1:n) {
  a<-table.row.start(a)
  a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
  for (j in 1:n) {
    r <- cor.test(y[i,],y[j,],method=par1)
    a<-table.element(a,round(r$estimate,3))
  }
  a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
ncorrs <- (n*n -n)/2
mycorrs <- array(0, dim=c(10,3))
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1)  # full test output for the first pair (shown in the raw R output)
for (i in 1:(n-1))
{
  for (j in (i+1):n)
  {
    a<-table.row.start(a)
    dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
    a<-table.element(a,dum,header=TRUE)
    rp <- cor.test(y[i,],y[j,],method='pearson')
    a<-table.element(a,round(rp$estimate,4))
    rs <- cor.test(y[i,],y[j,],method='spearman')
    a<-table.element(a,round(rs$estimate,4))
    rk <- cor.test(y[i,],y[j,],method='kendall')
    a<-table.element(a,round(rk$estimate,4))
    a<-table.row.end(a)
    a<-table.row.start(a)
    a<-table.element(a,'p-value',header=T)
    a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
    a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
    a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
    a<-table.row.end(a)
    # count how many pairwise tests are significant at alpha = 0.01, ..., 0.10
    for (iii in 1:10) {
      iiid100 <- iii / 100
      if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
      if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
      if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
    }
  }
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
for (iii in 1:10) {
  iiid100 <- iii / 100
  a<-table.row.start(a)
  a<-table.element(a,round(iiid100,2),header=T)
  a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))
  a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))
  a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))
  a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')