| Class | Description |
| --- | --- |
| Entropy | Calculates entropy, joint entropy, or conditional entropy, which quantify the uncertainty in the states of a discrete distribution. |
| IntervalStatistics | Calculates statistics such as the mean and variance for discretized variables. |
| JensenShannon | Methods for computing the Jensen–Shannon divergence, which measures the similarity between probability distributions. |
| KullbackLeibler | Calculates the Kullback–Leibler divergence D(P‖Q) between two distributions over the same variables. |
| MutualInformation | Calculates mutual information or conditional mutual information, which measure the dependence between two variables. |
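To make the entropy-related quantities concrete, here is a minimal self-contained sketch of Shannon entropy, joint entropy, and conditional entropy for discrete distributions. The function names and the representation of a joint distribution as a nested list are illustrative assumptions, not the API of the classes listed above.

```python
import math

def entropy(p):
    # Shannon entropy H(P) = -sum_i p_i * log2(p_i), skipping zero-probability states.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def joint_entropy(joint):
    # Joint entropy H(X, Y) over a joint probability table (rows index X, columns Y).
    return -sum(p * math.log2(p) for row in joint for p in row if p > 0)

def conditional_entropy(joint):
    # Conditional entropy via the chain rule: H(Y | X) = H(X, Y) - H(X).
    px = [sum(row) for row in joint]  # marginal of X (row sums)
    return joint_entropy(joint) - entropy(px)
```

For example, a fair coin has `entropy([0.5, 0.5]) == 1.0` bit, and for two independent fair coins the conditional entropy `H(Y | X)` remains 1 bit, since knowing X tells us nothing about Y.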
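The two divergences can be sketched in a few lines. Note the asymmetry of D(P‖Q) versus the symmetry of Jensen–Shannon; with log base 2 the Jensen–Shannon divergence is bounded in [0, 1]. These helper functions are a hand-rolled illustration under that convention, not the methods of the classes above.

```python
import math

def kl_divergence(p, q):
    # D(P || Q) = sum_i p_i * log2(p_i / q_i); requires q_i > 0 wherever p_i > 0.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    # Jensen-Shannon divergence: average KL of each distribution to their midpoint M.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
```

Identical distributions give both divergences a value of 0, while completely disjoint distributions such as `[1, 0]` and `[0, 1]` give the maximal Jensen–Shannon divergence of 1 bit (whereas D(P‖Q) would be undefined there, since Q assigns zero probability to a state P can produce).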
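Mutual information relates directly to the quantities above: I(X;Y) = H(X) + H(Y) - H(X,Y), or equivalently the KL divergence between the joint distribution and the product of its marginals. A minimal sketch, again assuming a joint table as a nested list rather than the actual class interface:

```python
import math

def mutual_information(joint):
    # I(X; Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) ).
    px = [sum(row) for row in joint]          # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]    # marginal of Y (column sums)
    return sum(
        joint[i][j] * math.log2(joint[i][j] / (px[i] * py[j]))
        for i in range(len(joint))
        for j in range(len(joint[0]))
        if joint[i][j] > 0
    )
```

Independent variables yield 0 bits, while a perfectly correlated pair of binary variables, e.g. the joint table `[[0.5, 0.0], [0.0, 0.5]]`, yields 1 bit: observing one variable fully determines the other.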