Published September 13, 2022 | Version v1 | Poster | Open
An Estimator for Conditional Mutual Information and Transfer Entropy on Metric Spaces
Description
- Conditional mutual information quantifies the dependence of two random variables conditioned on a third: it extends mutual information by discounting the information already encoded in the conditioning variable (the standard definition is given below this list). Its applications include transfer entropy, a conditional mutual information that quantifies directed information flow.
- Estimating it accurately requires large amounts of data, and estimators suffer from the curse of dimensionality, limiting its use in analysing electrophysiological data as well as in data science and machine learning.
- Here, a Kozachenko-Leonenko approximation is used to derive a nearest-neighbour estimator. The estimator uses only the metric structure of the data: it depends on the distances between data points, not on the dimension of the space. This extends previous work on mutual information to conditional mutual information (a sketch of an estimator of this type follows the list).
- Control over the distance metric makes this method particularly applicable in neuroscience. Varying the metric gives control over the implicit time scale or, in the case of neuronal populations, over which population encodings are examined.
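
For reference, the textbook identity the first point refers to, writing conditional mutual information in terms of joint entropies (this is the standard definition, not a formula taken from the poster):

```latex
I(X;Y \mid Z) \;=\; H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
```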
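To make the nearest-neighbour idea concrete, here is a minimal sketch of a Frenzel-Pompe / KSG-style k-nearest-neighbour estimator for I(X;Y|Z) that works directly from pairwise distance matrices, so it applies on any metric space. This illustrates the general technique only; it is not the poster's exact derivation, and the names (`cmi_knn`, `dx`, `dy`, `dz`, `k`) are ours.

```python
# Sketch of a Frenzel-Pompe / KSG-style conditional mutual information
# estimator. It touches the data only through pairwise distance
# matrices, so any metric on X, Y, Z can be plugged in.
import numpy as np
from scipy.special import digamma

def cmi_knn(dx, dy, dz, k=4):
    """Estimate I(X; Y | Z) in nats from (n, n) distance matrices
    dx, dy, dz for the X, Y and Z samples, using k nearest
    neighbours in the joint space."""
    n = dx.shape[0]
    # Joint-space distance: max over component metrics (the
    # l-infinity product metric, the usual KSG-type choice).
    d_joint = np.maximum(np.maximum(dx, dy), dz)
    np.fill_diagonal(d_joint, np.inf)  # exclude self-matches

    # eps[i] = distance from point i to its k-th nearest neighbour
    # in the joint space.
    eps = np.sort(d_joint, axis=1)[:, k - 1]

    total = 0.0
    for i in range(n):
        # Count strictly closer neighbours in each marginal space
        # (subtract 1 to drop the point itself, which is at distance 0).
        n_xz = np.sum(np.maximum(dx[i], dz[i]) < eps[i]) - 1
        n_yz = np.sum(np.maximum(dy[i], dz[i]) < eps[i]) - 1
        n_z = np.sum(dz[i] < eps[i]) - 1
        total += digamma(n_xz + 1) + digamma(n_yz + 1) - digamma(n_z + 1)

    return digamma(k) - total / n
```

Because the estimator only ever sees distance matrices, swapping in a spike-train metric (e.g. a Victor-Purpura or van Rossum distance with a chosen time-scale parameter) changes the implicit time scale without changing the estimator itself, which is the flexibility the last point above describes.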
Files
BernsteinPoster_Witter.pdf (407.9 kB, md5:49c1c00c50ec484f83e094af1efa7004)