Free Statistics

of Irreproducible Research!

Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Date of computation: Wed, 17 Dec 2014 15:40:25 +0000
Cite this page as follows:
  Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/17/t1418831034exy3d0yvykhjhus.htm/, Retrieved Sun, 19 May 2024 19:19:55 +0000
  Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=270426, Retrieved Sun, 19 May 2024 19:19:55 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 111
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Blocked Bootstrap Plot - Central Tendency] [] [2014-11-02 13:37:17] [cc401d1001c65f55a3dfc6f2420e9570]
- RMPD  [Simple Linear Regression] [] [2014-11-02 15:26:26] [cc401d1001c65f55a3dfc6f2420e9570]
- RMPD      [Kendall tau Correlation Matrix] [] [2014-12-17 15:40:25] [6e98989d1e11d52934121e5a163a7817] [Current]
- R PD        [Kendall tau Correlation Matrix] [] [2014-12-18 19:14:03] [bcd8153d44f369b7624d3c1b4621c4c3]
-    D          [Kendall tau Correlation Matrix] [] [2014-12-18 19:15:24] [bcd8153d44f369b7624d3c1b4621c4c3]
Feedback Forum

Dataseries X:
21	7.5
26	2.5
22	6
22	6.5
18	1
23	1
12	5.5
20	8.5
22	6.5
21	4.5
19	2
22	5
15	0.5
20	5
19	5
18	2.5
15	5
20	5.5
21	3.5
21	3
15	4
16	0.5
23	6.5
21	4.5
18	7.5
25	5.5
9	4
30	7.5
20	7
23	4
16	5.5
16	2.5
19	5.5
25	0.5
25	3.5
18	2.5
23	4.5
21	4.5
10	4.5
14	6
22	2.5
26	5
23	0
23	5
24	6.5
24	5
18	6
23	4.5
15	5.5
19	1
16	7.5
25	6
23	5
17	1
19	5
21	6.5
18	7
27	4.5
21	0
13	8.5
8	3.5
29	7.5
28	3.5
23	6
21	1.5
19	9
19	3.5
20	3.5
18	4
19	6.5
17	7.5
19	6
25	5
19	5.5
22	3.5
23	7.5
26	1
14	6.5
28	NA
16	6.5
24	6.5
20	7
12	3.5
24	1.5
22	4
12	7.5
22	4.5
20	0
10	3.5
23	5.5
17	5
22	4.5
24	2.5
18	7.5
21	7
20	0
20	4.5
22	3
19	1.5
20	3.5
26	2.5
23	5.5
24	8
21	1
21	5
19	4.5
8	3
17	3
20	8
11	2.5
8	7
15	0
18	1
18	3.5
19	5.5
19	5.5
23	0.5
22	7.5
21	9
25	9.5
30	8.5
17	7
27	8
23	10
23	7
18	8.5
18	9
23	9.5
19	4
15	6
20	8
16	5.5
24	9.5
25	7.5
25	7
19	7.5
19	8
16	7
19	7
19	6
23	10
21	2.5
22	9
19	8
20	6
20	8.5
3	6
23	9
14	8
23	8
20	9
15	5.5
13	5
16	7
7	5.5
24	9
17	2
24	8.5
24	9
19	8.5
25	9
20	7.5
28	10
23	9
27	7.5
18	6
28	10.5
21	8.5
19	8
23	10
27	10.5
22	6.5
28	9.5
25	8.5
21	7.5
22	5
28	8
20	10
29	7
25	7.5
25	7.5
20	9.5
20	6
16	10
20	7
20	3
23	6
18	7
25	10
18	7
19	3.5
25	8
25	10
25	5.5
24	6
19	6.5
26	6.5
10	8.5
17	4
13	9.5
17	8
30	8.5
25	5.5
4	7
16	9
21	8
23	10
22	8
17	6
20	8
20	5
22	9
16	4.5
23	8.5
16	7
0	9.5
18	8.5
25	7.5
23	7.5
12	5
18	7
24	8
11	5.5
18	8.5
14	7.5
23	9.5
24	7
29	8
18	8.5
15	3.5
29	6.5
16	6.5
19	10.5
22	8.5
16	8
23	10
23	10
19	9.5
4	9
20	10
24	7.5
20	4.5
4	4.5
24	0.5
22	6.5
16	4.5
3	5.5
15	5
24	6
17	4
20	8
27	10.5
23	8.5
26	6.5
23	8
17	8.5
20	5.5
22	7
19	5
24	3.5
19	5
23	9
15	8.5
27	5
26	9.5
22	3
22	1.5
18	6
15	0.5
22	6.5
27	7.5
10	4.5
20	8
17	9
23	7.5
19	8.5
13	7
27	9.5
23	6.5
16	9.5
25	6
2	8
26	9.5
20	8
23	8
22	9
24	5




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 11 seconds
R Server: 'Gertrude Mary Cox' @ cox.wessa.net

Source: https://freestatistics.org/blog/index.php?pk=270426&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270426&T=0








Correlations for all pairs of data series (method=kendall)
     	Num  	Ex
Num  	1    	0.133
Ex   	0.133	1

Source: https://freestatistics.org/blog/index.php?pk=270426&T=1
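The matrix above can be reproduced with base R's cor(). A minimal sketch, using only an illustrative subset (the first eight observation pairs of Dataseries X) rather than the full sample, so the resulting coefficient will differ from the 0.133 reported for the complete series:

```r
# Illustrative subset: first eight (Num, Ex) pairs from Dataseries X above
Num <- c(21, 26, 22, 22, 18, 23, 12, 20)
Ex  <- c(7.5, 2.5, 6.0, 6.5, 1.0, 1.0, 5.5, 8.5)

# Kendall tau correlation matrix for the two series
m <- round(cor(cbind(Num, Ex), method = "kendall"), 3)
m
```

The diagonal is 1 by construction and the matrix is symmetric, which is what the table above shows for the full data.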

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270426&T=1








Correlations for all pairs of data series with p-values
pair   	Pearson r	Spearman rho	Kendall tau
Num;Ex 	0.1374   	0.1884      	0.1326
p-value	(0.0201) 	(0.0014)    	(0.0016)

Source: https://freestatistics.org/blog/index.php?pk=270426&T=2
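Each column of this table comes from cor.test() with the corresponding method. A minimal sketch on an illustrative subset; note that Dataseries X contains one incomplete pair (28, NA), which is dropped explicitly here (cor.test also omits incomplete pairs by default):

```r
# Illustrative subset including one incomplete pair, as in the full data
Num <- c(21, 26, 22, 22, 18, 23, 12, 20, 28)
Ex  <- c(7.5, 2.5, 6.0, 6.5, 1.0, 1.0, 5.5, 8.5, NA)

ok <- complete.cases(Num, Ex)   # drops the (28, NA) pair

# One column per method: correlation estimate and its p-value
results <- sapply(c("pearson", "spearman", "kendall"), function(m) {
  ct <- cor.test(Num[ok], Ex[ok], method = m)
  c(estimate = unname(ct$estimate), p.value = ct$p.value)
})
round(results, 4)
```

With tied values R warns that exact p-values cannot be computed for Spearman and Kendall and falls back to an approximation; the warning is harmless for this sketch.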

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270426&T=2








Meta Analysis of Correlation Tests
Number of significant correlations divided by total number of correlations
Type I error	Pearson r	Spearman rho	Kendall tau
0.01        	0        	1           	1
0.02        	0        	1           	1
0.03        	1        	1           	1
0.04        	1        	1           	1
0.05        	1        	1           	1
0.06        	1        	1           	1
0.07        	1        	1           	1
0.08        	1        	1           	1
0.09        	1        	1           	1
0.10        	1        	1           	1

Source: https://freestatistics.org/blog/index.php?pk=270426&T=3
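Each entry in the meta-analysis table is the fraction of tested pairs (here only one, Num;Ex) whose p-value falls below the given Type I error level. Using the p-values from the previous table, the tally can be sketched as:

```r
# p-values for the single Num;Ex pair, taken from the table above
pvals  <- c(pearson = 0.0201, spearman = 0.0014, kendall = 0.0016)
alphas <- seq(0.01, 0.10, by = 0.01)

# 1 if significant at that alpha, 0 otherwise; with a single pair the
# proportion equals the count
sig <- outer(alphas, pvals, function(a, p) as.numeric(p < a))
dimnames(sig) <- list(format(alphas), names(pvals))
sig
```

This reproduces the pattern above: the Pearson test (p = 0.0201) is only significant from alpha = 0.03 onward, while the Spearman and Kendall tests are significant at every level shown.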

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270426&T=3




Parameters (Session):
par1 = 14 ; par2 = grey ; par3 = TRUE ; par4 = Unknown ;
Parameters (R input):
par1 = kendall ;
R code (references can be found in the software module):
par1 <- 'kendall'

# Lower panel for pairs(): prints the p-value of the correlation test
# (method given by par1) for each pair of series
panel.tau <- function(x, y, digits = 2, prefix = '', cex.cor)
{
  usr <- par('usr'); on.exit(par(usr))
  par(usr = c(0, 1, 0, 1))
  rr <- cor.test(x, y, method = par1)
  r <- round(rr$p.value, 2)
  txt <- format(c(r, 0.123456789), digits = digits)[1]
  txt <- paste(prefix, txt, sep = '')
  if (missing(cex.cor)) cex <- 0.5 / strwidth(txt)
  text(0.5, 0.5, txt, cex = cex)
}

# Diagonal panel: histogram of each series, rescaled to fit the panel
panel.hist <- function(x, ...)
{
  usr <- par('usr'); on.exit(par(usr))
  par(usr = c(usr[1:2], 0, 1.5))
  h <- hist(x, plot = FALSE)
  breaks <- h$breaks; nB <- length(breaks)
  y <- h$counts; y <- y / max(y)
  rect(breaks[-nB], 0, breaks[-1], y, col = 'grey', ...)
}

# 'y' (data matrix, one series per row) and 'main' (plot title) are
# supplied by the R Framework before this script runs
bitmap(file = 'test1.png')
pairs(t(y), diag.panel = panel.hist, upper.panel = panel.smooth, lower.panel = panel.tau, main = main)
dev.off()
# table.start(), table.row.*(), table.element() and table.save() are
# framework helpers loaded from 'createtable'
load(file = 'createtable')
n <- length(y[,1])   # number of data series (rows of y)
n
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
for (j in 1:n) {
r <- cor.test(y[i,],y[j,],method=par1)
a<-table.element(a,round(r$estimate,3))
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
ncorrs <- (n*n - n)/2   # number of distinct series pairs
mycorrs <- array(0, dim=c(10,3))
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1)   # echoed to the raw output log
for (i in 1:(n-1))
{
for (j in (i+1):n)
{
a<-table.row.start(a)
dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
a<-table.element(a,dum,header=TRUE)
rp <- cor.test(y[i,],y[j,],method='pearson')
a<-table.element(a,round(rp$estimate,4))
rs <- cor.test(y[i,],y[j,],method='spearman')
a<-table.element(a,round(rs$estimate,4))
rk <- cor.test(y[i,],y[j,],method='kendall')
a<-table.element(a,round(rk$estimate,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-value',header=T)
a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
}
}
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
a<-table.row.start(a)
a<-table.element(a,round(iiid100,2),header=T)
a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
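The listing above depends on objects supplied by the hosting framework: the data matrix y, the plot title main, and the table.*() helpers loaded from 'createtable'. A self-contained sketch of just the plotting step, with hypothetical random data standing in for the framework-supplied series:

```r
# Hypothetical example data standing in for the framework-supplied series
set.seed(1)
dat <- cbind(Num = rpois(50, 20), Ex = round(runif(50, 0, 10) * 2) / 2)

# Lower panel: p-value of the Kendall correlation test, as in panel.tau
# above (exact = FALSE avoids the tied-values warning)
panel.tau <- function(x, y, digits = 2, ...) {
  usr <- par("usr"); on.exit(par(usr))
  par(usr = c(0, 1, 0, 1))
  p <- cor.test(x, y, method = "kendall", exact = FALSE)$p.value
  text(0.5, 0.5, format(round(p, digits)))
}

# png() replaces the framework's bitmap() device, which needs Ghostscript
png("pairs_sketch.png")
pairs(dat, upper.panel = panel.smooth, lower.panel = panel.tau,
      main = "Kendall tau Correlation Matrix")
dev.off()
```

This writes a scatterplot matrix with smoothed fits in the upper panels and Kendall-test p-values in the lower panels, the same layout the module produces for the real data.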