Measures the uncertainty of a distribution.
The marginal or joint distribution.
The logarithm base to use for the calculations.
The entropy value.
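As a sketch of what this computes, Shannon entropy is H(X) = -Σ p·log_b(p) over the states of the distribution. The function name and input shape below (a plain list of probabilities) are assumptions for illustration, not the documented signature.

```python
from math import log

def entropy(probs, base=2.0):
    """Shannon entropy H(X) = -sum(p * log_b(p)) over nonzero probabilities."""
    # Zero-probability states contribute nothing, so they are skipped.
    return -sum(p * log(p, base) for p in probs if p > 0)

# A uniform distribution over 4 states has maximal uncertainty: 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

With a degenerate distribution (all mass on one state) the entropy is 0, the minimum possible value.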
Measures the uncertainty of a distribution conditional on one or more variables.
The marginal or joint distribution.
The conditioning variables, i.e. those appearing on the right-hand side of the bar in H(Y|X).
The logarithm base to use for the calculations.
The entropy value.
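Conditional entropy can be computed from a joint distribution via H(Y|X) = H(X,Y) - H(X). The sketch below assumes the joint distribution is given as a mapping from (x, y) pairs to probabilities; this input shape is chosen for illustration and is not the documented signature.

```python
from math import log

def conditional_entropy(joint, base=2.0):
    """H(Y|X) = H(X,Y) - H(X), computed from a joint table p(x, y)."""
    def H(probs):
        return -sum(p * log(p, base) for p in probs if p > 0)

    # Marginal p(x), obtained by summing the joint over y.
    px = {}
    for (x, _y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return H(joint.values()) - H(px.values())

# When Y is fully determined by X, knowing X leaves no uncertainty: H(Y|X) = 0.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(conditional_entropy(joint))  # 0.0
```

When X and Y are independent, H(Y|X) equals H(Y): conditioning on X provides no information about Y.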
Measures the uncertainty of a distribution conditional on one or more variables.
The marginal or joint distribution.
The conditioning variables, i.e. those appearing on the right-hand side of the bar in H(Y|X).
The logarithm base to use for the calculations.
The entropy value.
Measures the uncertainty of a distribution.
The marginal or joint distribution.
The logarithm base to use for the calculations.
The entropy value.
Measures the uncertainty of a distribution conditional on one or more variables.
The marginal or joint distribution.
The conditioning variables, i.e. those appearing on the right-hand side of the bar in H(Y|X).
The logarithm base to use for the calculations.
The entropy value.
Measures the uncertainty of a distribution.
The marginal or joint distribution.
The logarithm base to use for the calculations.
The entropy value.
Calculates entropy, joint entropy, or conditional entropy, which can be used to determine the uncertainty in the states of a discrete distribution.
A higher value indicates less certainty about being in a particular state.
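The three quantities are tied together by the chain rule, H(X,Y) = H(X) + H(Y|X). The sketch below verifies this identity numerically for a small joint distribution; the helper names are illustrative, not part of the documented API.

```python
from math import log

def H(probs, base=2.0):
    """Shannon entropy of a collection of probabilities."""
    return -sum(p * log(p, base) for p in probs if p > 0)

# Joint distribution p(x, y) for two correlated binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal p(x) by summing the joint over y.
px = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}

h_joint = H(joint.values())   # joint entropy H(X, Y)
h_x = H(px.values())          # marginal entropy H(X)
h_y_given_x = h_joint - h_x   # conditional entropy H(Y|X)

# Chain rule: H(X, Y) = H(X) + H(Y|X).
assert abs(h_joint - (h_x + h_y_given_x)) < 1e-12
```

Because the variables here are correlated, H(Y|X) is strictly less than H(Y): observing X reduces the remaining uncertainty about Y.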