Compute information-based estimates and distances.
```r
entropy(.data, .base = 2, .norm = FALSE, .do.norm = NA, .laplace = 1e-12)

kl_div(.alpha, .beta, .base = 2, .do.norm = NA, .laplace = 1e-12)

js_div(.alpha, .beta, .base = 2, .do.norm = NA, .laplace = 1e-12,
  .norm.entropy = FALSE)

cross_entropy(.alpha, .beta, .base = 2, .do.norm = NA, .laplace = 1e-12,
  .norm.entropy = FALSE)
```
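For reference, the quantities these functions compute follow the standard definitions below, with the logarithm base b set by .base; this is a general sketch of the underlying formulas, not package-specific notation, and the exact treatment of normalisation and smoothing is governed by the arguments described next:

$$
\begin{aligned}
H(P) &= -\sum_i p_i \log_b p_i \\
D_{\mathrm{KL}}(P \parallel Q) &= \sum_i p_i \log_b \frac{p_i}{q_i} \\
D_{\mathrm{JS}}(P \parallel Q) &= \tfrac{1}{2} D_{\mathrm{KL}}(P \parallel M) + \tfrac{1}{2} D_{\mathrm{KL}}(Q \parallel M), \qquad M = \tfrac{1}{2}(P + Q) \\
H(P, Q) &= -\sum_i p_i \log_b q_i
\end{aligned}
$$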
| Argument | Description |
|---|---|
| .data | Numeric vector. Any distribution. |
| .base | Numeric. The base of the logarithm. |
| .norm | Logical. If TRUE, normalise the entropy by its maximal possible value. |
| .do.norm | If TRUE, normalise the input distributions so that they sum to 1. |
| .laplace | Numeric. Value for the Laplace correction (additive smoothing). |
| .alpha | Numeric vector. A distribution of some random variable. |
| .beta | Numeric vector. A distribution of some random variable. |
| .norm.entropy | Logical. If TRUE, normalise the resulting value by the average entropy of the input distributions. |
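A minimal usage sketch, assuming the package providing these functions is attached; the sample vectors are illustrative and not taken from the package documentation:

```r
# Two toy distributions over four outcomes
p <- c(0.1, 0.2, 0.3, 0.4)
q <- c(0.25, 0.25, 0.25, 0.25)  # uniform

entropy(q)               # uniform distribution attains the maximum: 2 bits
entropy(p, .norm = TRUE) # entropy rescaled by its maximal value, in [0, 1]

kl_div(p, q)             # asymmetric: kl_div(q, p) gives a different value
js_div(p, q)             # symmetric; bounded by 1 when .base = 2
cross_entropy(p, q)      # equals entropy(p) + kl_div(p, q)
```

Unnormalised inputs such as raw counts can be handled by setting .do.norm = TRUE, which rescales each vector to sum to 1 before the computation.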