Class Summary
| BrayCurtis | BrayCurtis(X,Y) = 1 - 2*W/(sum(X)+sum(Y)), with W = sum_i[ min(x_i,y_i) ] |
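The formula above can be sketched as a minimal standalone method (class and method names here are illustrative, not this library's API):

```java
public class BrayCurtisSketch {
    /** BrayCurtis(X,Y) = 1 - 2*W/(sum(X)+sum(Y)), with W = sum_i min(x_i, y_i). */
    public static double brayCurtis(double[] x, double[] y) {
        double w = 0, sx = 0, sy = 0;
        for (int i = 0; i < x.length; i++) {
            w += Math.min(x[i], y[i]);
            sx += x[i];
            sy += y[i];
        }
        return 1.0 - 2.0 * w / (sx + sy);
    }

    public static void main(String[] args) {
        // 0 for identical vectors, up to 1 for vectors with no overlap
        System.out.println(brayCurtis(new double[]{1, 2, 3}, new double[]{2, 2, 1}));
    }
}
```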
| BrownCorrelation | Brownian (distance) correlation; see Székely, Rizzo and Bakirov (2007), Székely & Rizzo (2009), and http://en.wikipedia.org/wiki/Distance_correlation#Distance_correlation. Missing values are handled by estimating means from all available values and inferring the covariance from them. |
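The class's exact implementation is not shown here; as a sketch, the standard (biased) sample distance correlation for univariate samples double-centers the pairwise distance matrices and correlates them (names illustrative, not this library's API):

```java
public class DistanceCorrelationSketch {
    // Pairwise distances |v_i - v_j|, double-centered:
    // A[i][j] = d[i][j] - rowMean_i - rowMean_j + grandMean (matrix is symmetric).
    static double[][] centeredDist(double[] v) {
        int n = v.length;
        double[][] d = new double[n][n];
        double[] rowMean = new double[n];
        double grand = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) {
                d[i][j] = Math.abs(v[i] - v[j]);
                rowMean[i] += d[i][j] / n;
                grand += d[i][j] / (n * (double) n);
            }
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                d[i][j] += grand - rowMean[i] - rowMean[j];
        return d;
    }

    /** dCor = dCov / sqrt(dVarX * dVarY), all from the centered distance matrices. */
    public static double distanceCorrelation(double[] x, double[] y) {
        int n = x.length;
        double[][] a = centeredDist(x), b = centeredDist(y);
        double cov = 0, varX = 0, varY = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) {
                cov += a[i][j] * b[i][j];
                varX += a[i][j] * a[i][j];
                varY += b[i][j] * b[i][j];
            }
        if (varX == 0 || varY == 0) return 0;
        return Math.sqrt(cov / Math.sqrt(varX * varY));
    }

    public static void main(String[] args) {
        double[] x = {1, 2, 3, 5, 8};
        double[] y = {2, 4, 6, 10, 16}; // exactly linear in x
        System.out.println(distanceCorrelation(x, y)); // ~1.0 for linearly dependent samples
    }
}
```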
| Entropy | Abstract base class for entropy measures; concrete subclasses must supply an estimator. |
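One plausible shape for such an estimator-backed design, purely as an illustration (these interface and class names are hypothetical, not this library's API):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical estimator contract: concrete entropy classes plug in an implementation.
interface EntropyEstimator {
    double entropy(double[] sample); // entropy in nats
}

public class PluginEntropyEstimator implements EntropyEstimator {
    /** Naive plug-in estimator: H = -sum_v p(v) * ln p(v) over empirical frequencies. */
    public double entropy(double[] sample) {
        Map<Double, Integer> counts = new HashMap<>();
        for (double v : sample) counts.merge(v, 1, Integer::sum);
        double h = 0, n = sample.length;
        for (int c : counts.values()) {
            double p = c / n;
            h -= p * Math.log(p);
        }
        return h;
    }

    public static void main(String[] args) {
        // Two equally likely values -> ln(2) nats
        System.out.println(new PluginEntropyEstimator().entropy(new double[]{0, 0, 1, 1}));
    }
}
```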
| Euclidean | Euclidean distance. |
| Hellinger | Hellinger distance. |
| HilbertSchmidt | Computes the Hilbert-Schmidt independence criterion (HSIC) using the estimate from the paper "On Kernel Parameter Selection in Hilbert-Schmidt Independence Criterion", p. 3. |
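A common form of the HSIC estimate is the biased statistic (1/n^2) * trace(K H L H), with H = I - (1/n) 1 1^T; whether the class uses exactly this form, and its kernel and bandwidth choices, are assumptions here. A sketch with Gaussian kernels (names illustrative):

```java
public class HsicSketch {
    // Gaussian kernel matrix K[i][j] = exp(-(v_i - v_j)^2 / (2*sigma^2)).
    static double[][] gaussianKernel(double[] v, double sigma) {
        int n = v.length;
        double[][] k = new double[n][n];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) {
                double d = v[i] - v[j];
                k[i][j] = Math.exp(-d * d / (2 * sigma * sigma));
            }
        return k;
    }

    /**
     * Biased estimate HSIC = (1/n^2) * trace(K H L H), computed as
     * (1/n^2) * sum_ij (HKH)[i][j] * L[i][j], since K and L are symmetric.
     */
    public static double hsic(double[] x, double[] y, double sigma) {
        int n = x.length;
        double[][] k = gaussianKernel(x, sigma);
        double[][] l = gaussianKernel(y, sigma);
        double[] rowMean = new double[n];
        double grand = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) {
                rowMean[i] += k[i][j] / n;
                grand += k[i][j] / (n * (double) n);
            }
        double sum = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                sum += (k[i][j] - rowMean[i] - rowMean[j] + grand) * l[i][j];
        return sum / (n * (double) n);
    }

    public static void main(String[] args) {
        double[] x = {0, 1, 2, 3};
        System.out.println(hsic(x, x, 1.0));                       // positive: x depends on itself
        System.out.println(hsic(x, new double[]{1, 1, 1, 1}, 1.0)); // ~0: constant carries no dependence
    }
}
```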
| JensenShannon | Jensen-Shannon divergence. |
| Kendall | Kendall's Tau-b (adjusted for ties), computed with Knight's algorithm. |
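The class itself uses Knight's O(n log n) algorithm; for clarity, this sketch spells out Tau-b via the straightforward O(n^2) pairwise definition instead (names illustrative, not this library's API):

```java
public class KendallTauSketch {
    /** tau_b = (C - D) / sqrt((C + D + tiesYonly) * (C + D + tiesXonly)). */
    public static double tauB(double[] x, double[] y) {
        long concordant = 0, discordant = 0, tiedX = 0, tiedY = 0;
        int n = x.length;
        for (int i = 0; i < n; i++)
            for (int j = i + 1; j < n; j++) {
                double dx = x[i] - x[j], dy = y[i] - y[j];
                if (dx == 0 && dy == 0) continue; // tied in both: excluded from every term
                else if (dx == 0) tiedX++;        // tied in x only
                else if (dy == 0) tiedY++;        // tied in y only
                else if (dx * dy > 0) concordant++;
                else discordant++;
            }
        double pairsDistinctX = concordant + discordant + tiedY;
        double pairsDistinctY = concordant + discordant + tiedX;
        return (concordant - discordant) / Math.sqrt(pairsDistinctX * pairsDistinctY);
    }

    public static void main(String[] args) {
        // Identical rankings (with a tie) -> 1; reversed rankings -> -1
        System.out.println(tauB(new double[]{1, 2, 2, 3}, new double[]{1, 2, 2, 3}));
        System.out.println(tauB(new double[]{1, 2, 2, 3}, new double[]{3, 2, 2, 1}));
    }
}
```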
| KullbackLeibler | Symmetrized Kullback-Leibler divergence: [ KLD(P|Q) + KLD(Q|P) ] / 2. |
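The symmetrization above is a direct average of the two directed divergences; a minimal sketch (names illustrative; it assumes q[i] > 0 wherever p[i] > 0):

```java
public class SymmetricKldSketch {
    /** Directed divergence KLD(P|Q) = sum_i p_i * ln(p_i / q_i), in nats. */
    static double kld(double[] p, double[] q) {
        double s = 0;
        for (int i = 0; i < p.length; i++)
            if (p[i] > 0) s += p[i] * Math.log(p[i] / q[i]);
        return s;
    }

    /** Symmetrized: [ KLD(P|Q) + KLD(Q|P) ] / 2. */
    public static double symmetricKld(double[] p, double[] q) {
        return 0.5 * (kld(p, q) + kld(q, p));
    }

    public static void main(String[] args) {
        // 0 for identical distributions, positive otherwise
        System.out.println(symmetricKld(new double[]{0.5, 0.5}, new double[]{0.25, 0.75}));
    }
}
```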
| MutualInformation | Mutual information computed via the entropy identities: I(X,Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X,Y). |
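The last identity, I(X,Y) = H(X) + H(Y) - H(X,Y), is easy to realize with empirical entropies over discrete samples (a sketch with illustrative names, not this library's API):

```java
import java.util.HashMap;
import java.util.Map;

public class MutualInformationSketch {
    /** Empirical entropy in nats from a count table. */
    static double entropy(Map<?, Integer> counts, double n) {
        double h = 0;
        for (int c : counts.values()) {
            double p = c / n;
            h -= p * Math.log(p);
        }
        return h;
    }

    /** I(X,Y) = H(X) + H(Y) - H(X,Y), from marginal and joint counts. */
    public static double mutualInformation(int[] x, int[] y) {
        Map<Integer, Integer> cx = new HashMap<>(), cy = new HashMap<>();
        Map<String, Integer> cxy = new HashMap<>();
        for (int i = 0; i < x.length; i++) {
            cx.merge(x[i], 1, Integer::sum);
            cy.merge(y[i], 1, Integer::sum);
            cxy.merge(x[i] + "," + y[i], 1, Integer::sum);
        }
        double n = x.length;
        return entropy(cx, n) + entropy(cy, n) - entropy(cxy, n);
    }

    public static void main(String[] args) {
        // Y identical to X -> ln(2) nats; X, Y independent-looking -> ~0
        System.out.println(mutualInformation(new int[]{0, 1, 0, 1}, new int[]{0, 1, 0, 1}));
        System.out.println(mutualInformation(new int[]{0, 0, 1, 1}, new int[]{0, 1, 0, 1}));
    }
}
```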
| Pearson | Pearson correlation coefficient. |
| Spearman | Spearman rank correlation coefficient. |
| Steinhaus | Steinhaus(X,Y) = 2*W/(sum(X)+sum(Y)), with W = sum_i[ min(x_i,y_i) ] |
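Note this is the similarity whose complement (1 minus it) is the Bray-Curtis dissimilarity; a minimal sketch (names illustrative):

```java
public class SteinhausSketch {
    /** Steinhaus(X,Y) = 2*W/(sum(X)+sum(Y)), with W = sum_i min(x_i, y_i). */
    public static double steinhaus(double[] x, double[] y) {
        double w = 0, sx = 0, sy = 0;
        for (int i = 0; i < x.length; i++) {
            w += Math.min(x[i], y[i]);
            sx += x[i];
            sy += y[i];
        }
        return 2.0 * w / (sx + sy);
    }

    public static void main(String[] args) {
        // 1 for identical vectors, 0 for vectors with no overlap
        System.out.println(steinhaus(new double[]{1, 2, 3}, new double[]{2, 2, 1}));
    }
}
```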
| VarianceOfLogRatios | Variance of log ratios scaled to [0;1]: 1 - exp(-sqrt(D(X,Y))), following Aitchison, where D(X,Y) is the variance of the log ratios. |
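A sketch of this scaling, taking D(X,Y) as the variance of log(x_i/y_i); whether the library uses the population (divide by n) or sample (n-1) variance is an assumption here, as are the names:

```java
public class VlrSketch {
    /** 1 - exp(-sqrt(D(X,Y))), with D = population variance of log(x_i / y_i). */
    public static double scaledVlr(double[] x, double[] y) {
        int n = x.length;
        double[] r = new double[n];
        double mean = 0;
        for (int i = 0; i < n; i++) {
            r[i] = Math.log(x[i] / y[i]); // requires strictly positive inputs
            mean += r[i] / n;
        }
        double var = 0;
        for (int i = 0; i < n; i++) var += (r[i] - mean) * (r[i] - mean) / n;
        return 1.0 - Math.exp(-Math.sqrt(var));
    }

    public static void main(String[] args) {
        // Proportional vectors have constant log ratios -> D = 0 -> result 0
        System.out.println(scaledVlr(new double[]{1, 2, 4}, new double[]{3, 6, 12}));
    }
}
```

The 1 - exp(-sqrt(.)) wrapper maps the unbounded variance D in [0;inf) onto [0;1), keeping 0 for perfectly proportional compositions.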