entropy.shrink {entropy}        R Documentation
Description

entropy.shrink estimates the Shannon entropy H of the random variable Y
from the corresponding observed counts y by plug-in of the shrinkage
estimate of the bin frequencies.

mi.shrink estimates the corresponding mutual information of two random variables.

freqs.shrink estimates the bin frequencies from the counts y using a
James-Stein-type shrinkage estimator. The default shrinkage target is the
uniform distribution, unless otherwise specified.
Usage

entropy.shrink(y, unit=c("log", "log2", "log10"), target=1/length(y), verbose=TRUE)
mi.shrink(y, unit=c("log", "log2", "log10"), target=1/length(y), verbose=TRUE)
freqs.shrink(y, target=1/length(y), verbose=TRUE)
Arguments

y: vector or matrix of counts.

unit: the unit in which entropy is measured.

target: the shrinkage target for the frequencies (default: uniform distribution).

verbose: report shrinkage intensity and equivalent pseudocount.
Details

The shrinkage estimator is a James-Stein-type estimator. It is essentially
an entropy.Dirichlet estimator in which the pseudocount is estimated from
the data. For details see Hausser and Strimmer (2008).
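As an illustration of this idea, the shrinkage frequency estimate can be
sketched in a few lines of R. This is a hedged sketch based on the formula
in the referenced preprint, not the package's internal code; the helper
name js.freqs is hypothetical, and the clipping of lambda to [0, 1] is an
assumption.

# minimal sketch of a James-Stein-type shrinkage frequency estimate:
# a convex combination of the (uniform) target and the ML frequencies,
# with the shrinkage intensity lambda estimated from the data
js.freqs = function(y, target = 1/length(y))
{
  n = sum(y)            # total number of counts
  u = y/n               # maximum-likelihood frequencies
  lambda = (1 - sum(u^2)) / ((n - 1) * sum((target - u)^2))
  lambda = max(0, min(1, lambda))   # clip to [0, 1]
  lambda*target + (1 - lambda)*u
}

The plug-in entropy estimate is then computed from the shrunken
frequencies f, e.g. -sum(f[f > 0] * log(f[f > 0])) for entropy in nats.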
Value

entropy.shrink returns an estimate of the Shannon entropy.

mi.shrink returns an estimate of the mutual information.

freqs.shrink returns the underlying frequencies.

In all cases the estimated shrinkage intensity is attached to the returned
value as the attribute lambda.freqs.
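For example, given a count vector y, the shrinkage intensity can be read
off the result with the standard R function attr():

attr(freqs.shrink(y), "lambda.freqs")   # estimated shrinkage intensity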
Author(s)

Korbinian Strimmer (http://strimmerlab.org).
References

Hausser, J., and K. Strimmer. 2008. Entropy inference and the James-Stein estimator. Preprint (see http://strimmerlab.org/publications/entropy2008.pdf).
See Also

entropy, entropy.Dirichlet, entropy.NSB, entropy.ChaoShen,
entropy.plugin, mi.plugin.
Examples

# load entropy library
library("entropy")
# observed counts for each bin
y = c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)
# shrinkage estimate
entropy.shrink(y)
# contingency table with counts for two discrete variables
y = rbind( c(1,2,3), c(6,5,4) )
# shrinkage estimate of mutual information
mi.shrink(y)
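# freqs.shrink itself is not shown above; as the arguments permit a matrix
# of counts, it can (presumably) also be applied to the contingency table
# to inspect the shrunken cell frequencies underlying the MI estimate
freqs.shrink(y)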