Free Statistics


Author: (the author of this computation has been verified)
R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Date of computation: Fri, 02 Dec 2016 21:23:37 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/02/t1480710470onxbwx542dcticd.htm/, Retrieved Sat, 18 May 2024 05:35:42 +0000
Alternative citation: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=297601, Retrieved Sat, 18 May 2024 05:35:42 +0000
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 73
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
-       [Kendall tau Correlation Matrix] [] [2016-12-02 20:23:37] [219800a2f11ddd28e3280d87dbde8c8d] [Current]
Dataseries X (five tab-separated columns, corresponding to the variables EP1, EP2, EP3, EP4 and ITH in the tables below):
5	5	4	1	14
3	3	2	5	19
5	5	3	1	17
5	4	2	2	17
5	4	2	1	15
5	5	3	4	20
5	3	3	1	15
5	5	2	1	19
5	5	2	1	15
5	5	4	2	15
4	5	2	1	19
2	4	2	4	NA
5	4	3	1	20
4	5	2	5	18
5	5	3	2	15
4	5	2	1	14
5	4	2	NA	20
5	5	NA	NA	NA
5	5	3	2	16
4	5	2	1	16
4	5	2	4	16
3	4	3	1	10
5	5	1	2	19
4	4	2	3	19
5	5	3	1	16
4	4	2	4	15
5	5	2	2	18
5	4	3	3	17
5	5	5	1	19
5	5	2	4	17
5	5	5	1	NA
5	5	2	1	19
5	5	2	1	20
5	4	4	1	5
5	4	1	3	19
4	4	2	4	16
4	4	2	2	15
5	5	3	4	16
5	5	2	2	18
5	5	3	2	16
5	5	2	1	15
5	5	3	1	17
5	5	4	1	NA
5	5	4	5	20
5	5	3	1	19
5	5	2	1	7
5	4	2	1	13
NA	NA	1	NA	16
4	5	4	1	16
5	5	4	1	NA
5	5	3	2	18
4	4	2	2	18
5	5	2	2	16
3	4	2	2	17
4	3	2	3	19
3	3	3	1	16
5	4	2	NA	19
5	5	2	2	13
5	5	3	1	16
5	4	3	3	13
5	5	2	3	12
5	5	2	1	17
5	5	4	1	17
5	5	4	2	17
4	4	3	1	16
5	5	4	3	16
4	4	4	3	14
5	5	4	NA	16
2	2	4	4	13
4	3	5	4	16
5	5	3	2	14
5	5	4	1	20
4	3	4	1	12
5	5	2	1	13
2	3	2	3	18
5	4	3	2	14
3	3	4	1	19
4	5	2	1	18
4	4	5	1	14
5	5	1	1	18
5	5	3	1	19
4	4	3	1	15
4	4	2	3	14
5	5	2	1	17
4	5	1	4	19
4	4	2	2	13
5	5	1	4	19
5	5	2	1	18
5	5	2	1	20
4	4	2	1	15
4	4	2	2	15
4	4	3	5	15
3	3	2	3	20
4	4	1	4	15
5	5	1	1	19
5	5	3	4	18
4	4	2	4	18
5	5	3	2	15
2	2	1	3	20
5	5	2	1	17
5	5	2	1	12
4	4	3	4	18
3	5	2	4	19
5	5	2	1	20
4	4	3	3	NA
5	5	1	1	17
5	5	4	5	15
5	5	3	2	16
5	5	2	2	18
5	5	3	1	18
4	5	3	3	14
5	4	3	1	15
5	5	4	1	12
5	3	3	3	17
4	4	2	1	14
5	5	3	4	18
5	5	2	1	17
2	1	1	5	17
5	5	1	1	20
5	5	2	1	16
5	4	4	4	14
5	4	3	2	15
5	5	2	1	18
5	5	2	4	20
5	5	3	1	17
5	5	3	1	17
4	5	3	2	17
3	3	2	2	17
5	4	2	1	15
5	5	2	1	17
5	5	3	1	18
5	5	4	4	17
4	4	2	4	20
4	5	2	3	15
4	4	1	4	16
5	4	3	1	15
4	4	3	5	18
NA	NA	4	3	11
4	4	3	2	15
5	5	1	3	18
2	2	1	3	20
5	5	2	1	19
4	4	1	4	14
5	5	5	1	16
5	5	3	1	15
4	4	2	3	17
5	4	2	3	18
4	2	4	2	20
5	5	2	4	17
5	5	4	4	18
5	5	4	2	15
4	4	3	4	16
5	5	4	4	11
5	5	3	2	15
5	4	4	1	18
5	5	3	1	17
5	5	4	1	16
2	2	2	3	12
5	5	4	3	19
3	3	1	4	18
5	5	4	1	15
5	4	3	NA	17
5	5	2	3	19
4	4	2	3	18
5	5	2	NA	19
5	5	4	1	16
5	5	3	2	16
5	4	3	2	16
5	2	2	4	14
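
The listing above is a plain tab-separated dump in which NA marks missing values. As a hedged illustration (the file name and the column-to-variable mapping are assumptions inferred from the correlation tables below), it could be read into R like this:

# Illustrative sketch only: 'dataseries.txt' is a hypothetical file holding the
# listing above; column names follow the order used in the correlation tables.
dat <- read.table('dataseries.txt', header = FALSE, sep = '\t',
                  na.strings = 'NA',
                  col.names = c('EP1', 'EP2', 'EP3', 'EP4', 'ITH'))
str(dat)             # dimensions and types
colSums(is.na(dat))  # missing values per column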




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 1 second
R Server: Big Analytics Cloud Computing Center

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code) \tabularnewline
Raw Output & view raw output of R engine \tabularnewline
Computing time & 1 second \tabularnewline
R Server & Big Analytics Cloud Computing Center \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297601&T=0


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297601&T=0








Correlations for all pairs of data series (method=kendall)
        EP1     EP2     EP3     EP4     ITH
EP1     1       0.604   0.193   -0.294  0.014
EP2     0.604   1       0.074   -0.245  0.109
EP3     0.193   0.074   1       -0.102  -0.198
EP4     -0.294  -0.245  -0.102  1       0.07
ITH     0.014   0.109   -0.198  0.07    1

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series (method=kendall) \tabularnewline
  & EP1 & EP2 & EP3 & EP4 & ITH \tabularnewline
EP1 & 1 & 0.604 & 0.193 & -0.294 & 0.014 \tabularnewline
EP2 & 0.604 & 1 & 0.074 & -0.245 & 0.109 \tabularnewline
EP3 & 0.193 & 0.074 & 1 & -0.102 & -0.198 \tabularnewline
EP4 & -0.294 & -0.245 & -0.102 & 1 & 0.07 \tabularnewline
ITH & 0.014 & 0.109 & -0.198 & 0.07 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297601&T=1


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297601&T=1
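
The Kendall tau matrix above can be reproduced, up to rounding and the treatment of missing values, in plain R without the FreeStatistics table helpers. A minimal sketch, assuming the data frame dat from the reading sketch above and listwise deletion of incomplete rows (which is what the module's na.omit step appears to do):

# Sketch only: 'dat' is the hypothetical data frame read earlier; rows with any
# NA are dropped, mirroring the module's na.omit step before correlating.
complete <- na.omit(dat)
tau <- cor(complete, method = 'kendall')
round(tau, 3)   # should be close to the Kendall tau matrix reported above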








Correlations for all pairs of data series with p-values
pair      Pearson r   Spearman rho   Kendall tau
EP1;EP2   0.713       0.6263         0.6044
p-value   (0)         (0)            (0)
EP1;EP3   0.2098      0.214          0.1926
p-value   (0.0086)    (0.0073)       (0.007)
EP1;EP4   -0.3274     -0.3339        -0.2936
p-value   (0)         (0)            (0)
EP1;ITH   -0.0232     0.0152         0.014
p-value   (0.7734)    (0.8508)       (0.835)
EP2;EP3   0.0936      0.0826         0.074
p-value   (0.2451)    (0.3054)       (0.2959)
EP2;EP4   -0.2895     -0.2784        -0.2453
p-value   (2e-04)     (4e-04)        (5e-04)
EP2;ITH   0.0589      0.1264         0.1086
p-value   (0.4654)    (0.1159)       (0.1024)
EP3;EP4   -0.113      -0.1167        -0.1018
p-value   (0.1602)    (0.147)        (0.133)
EP3;ITH   -0.2236     -0.2413        -0.1977
p-value   (0.005)     (0.0024)       (0.0021)
EP4;ITH   0.1245      0.0896         0.0702
p-value   (0.1216)    (0.2661)       (0.2697)

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series with p-values \tabularnewline
pair & Pearson r & Spearman rho & Kendall tau \tabularnewline
EP1;EP2 & 0.713 & 0.6263 & 0.6044 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
EP1;EP3 & 0.2098 & 0.214 & 0.1926 \tabularnewline
p-value & (0.0086) & (0.0073) & (0.007) \tabularnewline
EP1;EP4 & -0.3274 & -0.3339 & -0.2936 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
EP1;ITH & -0.0232 & 0.0152 & 0.014 \tabularnewline
p-value & (0.7734) & (0.8508) & (0.835) \tabularnewline
EP2;EP3 & 0.0936 & 0.0826 & 0.074 \tabularnewline
p-value & (0.2451) & (0.3054) & (0.2959) \tabularnewline
EP2;EP4 & -0.2895 & -0.2784 & -0.2453 \tabularnewline
p-value & (2e-04) & (4e-04) & (5e-04) \tabularnewline
EP2;ITH & 0.0589 & 0.1264 & 0.1086 \tabularnewline
p-value & (0.4654) & (0.1159) & (0.1024) \tabularnewline
EP3;EP4 & -0.113 & -0.1167 & -0.1018 \tabularnewline
p-value & (0.1602) & (0.147) & (0.133) \tabularnewline
EP3;ITH & -0.2236 & -0.2413 & -0.1977 \tabularnewline
p-value & (0.005) & (0.0024) & (0.0021) \tabularnewline
EP4;ITH & 0.1245 & 0.0896 & 0.0702 \tabularnewline
p-value & (0.1216) & (0.2661) & (0.2697) \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297601&T=2


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297601&T=2
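
Each pair of rows in the table above corresponds to three calls to cor.test(), one per method. A minimal sketch for a single pair, again assuming the complete-case data frame from the previous sketch; with tied ranks R warns that exact p-values cannot be computed for the rank-based tests, which is expected for Likert-type data like this:

# Sketch only: one pair (EP1 vs EP2) with all three methods, as the module does
# inside its double loop over pairs.
for (m in c('pearson', 'spearman', 'kendall')) {
  ct <- suppressWarnings(cor.test(complete$EP1, complete$EP2, method = m))
  cat(m, ': estimate =', round(ct$estimate, 4),
      ', p-value =', round(ct$p.value, 4), '\n')
}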








Meta Analysis of Correlation Tests
Proportion of significant correlations (number significant / total number of correlations)
Type I error   Pearson r   Spearman rho   Kendall tau
0.01           0.5         0.5            0.5
0.02           0.5         0.5            0.5
0.03           0.5         0.5            0.5
0.04           0.5         0.5            0.5
0.05           0.5         0.5            0.5
0.06           0.5         0.5            0.5
0.07           0.5         0.5            0.5
0.08           0.5         0.5            0.5
0.09           0.5         0.5            0.5
0.1            0.5         0.5            0.5

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Correlation Tests \tabularnewline
Proportion of significant correlations (number significant / total number of correlations) \tabularnewline
Type I error & Pearson r & Spearman rho & Kendall tau \tabularnewline
0.01 & 0.5 & 0.5 & 0.5 \tabularnewline
0.02 & 0.5 & 0.5 & 0.5 \tabularnewline
0.03 & 0.5 & 0.5 & 0.5 \tabularnewline
0.04 & 0.5 & 0.5 & 0.5 \tabularnewline
0.05 & 0.5 & 0.5 & 0.5 \tabularnewline
0.06 & 0.5 & 0.5 & 0.5 \tabularnewline
0.07 & 0.5 & 0.5 & 0.5 \tabularnewline
0.08 & 0.5 & 0.5 & 0.5 \tabularnewline
0.09 & 0.5 & 0.5 & 0.5 \tabularnewline
0.1 & 0.5 & 0.5 & 0.5 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297601&T=3


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297601&T=3
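
The meta-analysis table divides, for each Type I error level, the number of significant pairwise tests by the total number of pairs (10 here, so 0.5 means 5 of 10 pairs). A minimal sketch of the same computation, assuming the complete-case data frame from the sketches above:

# Sketch only: proportion of significant pairwise correlations per method and
# per alpha level, mirroring the mycorrs/ncorrs bookkeeping in the module code.
vars   <- names(complete)
prs    <- combn(vars, 2, simplify = FALSE)
alphas <- seq(0.01, 0.10, by = 0.01)
meta <- sapply(c('pearson', 'spearman', 'kendall'), function(m) {
  p <- sapply(prs, function(v)
    suppressWarnings(cor.test(complete[[v[1]]], complete[[v[2]]],
                              method = m))$p.value)
  sapply(alphas, function(a) round(mean(p < a), 2))
})
rownames(meta) <- alphas
meta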




Parameters (Session):
par1 = kendall ;
Parameters (R input):
par1 = kendall ;
R code (references can be found in the software module):
# Lower-panel function for pairs(): displays the rounded p-value of the
# correlation test (method taken from par1) for each pair of series.
panel.tau <- function(x, y, digits=2, prefix='', cex.cor)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(0, 1, 0, 1))
rr <- cor.test(x, y, method=par1)
r <- round(rr$p.value,2)
txt <- format(c(r, 0.123456789), digits=digits)[1]
txt <- paste(prefix, txt, sep='')
if(missing(cex.cor)) cex <- 0.5/strwidth(txt)
text(0.5, 0.5, txt, cex = cex)
}
# Diagonal-panel function for pairs(): draws a scaled histogram of each series.
panel.hist <- function(x, ...)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(usr[1:2], 0, 1.5) )
h <- hist(x, plot = FALSE)
breaks <- h$breaks; nB <- length(breaks)
y <- h$counts; y <- y/max(y)
rect(breaks[-nB], 0, breaks[-1], y, col='grey', ...)
}
# x and y hold the data series (supplied by the compute environment); the
# na.omit steps drop missing values before plotting and tabulating.
x <- na.omit(x)
y <- t(na.omit(t(y)))
bitmap(file='test1.png')
pairs(t(y),diag.panel=panel.hist, upper.panel=panel.smooth, lower.panel=panel.tau, main=main)
dev.off()
load(file='createtable')   # loads the table.start/table.element/... helper functions used below
n <- length(y[,1])         # number of data series
print(n)
# First table: correlation matrix (cor.test estimates) with series names as headers.
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
for (j in 1:n) {
r <- cor.test(y[i,],y[j,],method=par1)
a<-table.element(a,round(r$estimate,3))
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
ncorrs <- (n*n -n)/2              # number of distinct pairs of series
mycorrs <- array(0, dim=c(10,3))  # counts of significant tests per alpha level (rows) and method (columns)
# Second table: pairwise Pearson, Spearman and Kendall correlations with p-values.
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1)   # result is not stored; it only appears in the raw R output
for (i in 1:(n-1))                  # loop over all distinct pairs (i < j) of series
{
for (j in (i+1):n)
{
a<-table.row.start(a)
dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
a<-table.element(a,dum,header=TRUE)
rp <- cor.test(y[i,],y[j,],method='pearson')
a<-table.element(a,round(rp$estimate,4))
rs <- cor.test(y[i,],y[j,],method='spearman')
a<-table.element(a,round(rs$estimate,4))
rk <- cor.test(y[i,],y[j,],method='kendall')
a<-table.element(a,round(rk$estimate,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-value',header=T)
a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
a<-table.row.end(a)
for (iii in 1:10) {   # tally tests significant at alpha = 0.01, 0.02, ..., 0.10
iiid100 <- iii / 100
if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
}
}
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
# Third table: proportion of significant correlations per method at each Type I error level.
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
a<-table.row.start(a)
a<-table.element(a,round(iiid100,2),header=T)
a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
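
The table.* helpers and the bitmap() device above are provided by the FreeStatistics compute server and are not reproduced here. As a hedged illustration of how the plotting part of the module might be run locally (png() substitutes for bitmap(); par1, main and the data are set by hand; this is a sketch, not the archived computation):

# Local sketch only: uses the panel.tau and panel.hist functions defined above
# and the hypothetical data frame 'dat' from the reading sketch.
par1 <- 'kendall'
main <- 'Kendall tau Correlation Matrix'
y <- t(as.matrix(na.omit(dat)))   # series in rows, as the module expects
png(file = 'test1.png')
pairs(t(y), diag.panel = panel.hist, upper.panel = panel.smooth,
      lower.panel = panel.tau, main = main)
dev.off()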