Preprint Open Access

Comparative performance of mutual information and transfer entropy for analyzing the balance of information flow and energy consumption at synapses

Mireille Conrad; Renaud B Jolivet

Abstract: Information theory has become an essential tool of modern neuroscience. It can, however, be difficult to apply in experimental contexts where acquiring very large datasets is prohibitive. Here, we compare the relative performance of two information theoretic measures, mutual information and transfer entropy, for the analysis of information flow and energetic consumption at synapses. We show that transfer entropy outperforms mutual information in terms of reliability of estimates for small datasets. However, we also show that a detailed understanding of the underlying neuronal biophysics is essential for properly interpreting the results obtained with transfer entropy. We conclude that when time and experimental conditions permit, mutual information might provide an easier-to-interpret alternative. Finally, we apply both measures to the study of energetic optimality of information flow at thalamic relay synapses in the visual pathway. We show that both measures recapitulate the experimental finding that these synapses are tuned to optimally balance information flowing through them with the energetic consumption associated with that synaptic and neuronal activity. Our results highlight the importance of conducting systematic computational studies prior to applying information theoretic tools to experimental data.
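To make the two measures being compared concrete, below is a minimal plug-in (maximum-likelihood) sketch of mutual information and transfer entropy for binary spike trains. This is an illustration of the standard definitions (Shannon; Schreiber's transfer entropy), not the authors' implementation; the toy driven pair at the end, where `dst` copies `src` with a one-step delay, is a hypothetical example chosen so the directed measure is easy to check.

```python
from collections import Counter
import math
import random

def entropy(samples):
    """Plug-in (maximum-likelihood) entropy estimate, in bits."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), plug-in estimate."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def transfer_entropy(src, dst, k=1):
    """TE(src -> dst) with history length k, expanded into joint entropies:
    TE = H(dst_t | dst_past) - H(dst_t | dst_past, src_past)."""
    past_d = [tuple(dst[i - k:i]) for i in range(k, len(dst))]
    past_s = [tuple(src[i - k:i]) for i in range(k, len(src))]
    cur_d = dst[k:]
    return (entropy(list(zip(cur_d, past_d)))
            + entropy(list(zip(past_d, past_s)))
            - entropy(past_d)
            - entropy(list(zip(cur_d, past_d, past_s))))

# Hypothetical toy pair: dst is src delayed by one time step,
# so information flows strictly from src to dst.
random.seed(0)
src = [random.randint(0, 1) for _ in range(2000)]
dst = [0] + src[:-1]

te_fwd = transfer_entropy(src, dst)  # close to 1 bit for this driven pair
te_rev = transfer_entropy(dst, src)  # close to 0 bits (no reverse flow)
```

Note that the plug-in estimators used here are biased for small sample sizes, which is precisely the regime the abstract is concerned with; in practice, bias-corrected estimators are preferred.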

Files (1.6 MB)

2020.06.01.127399v1.full.pdf (1.6 MB)
md5:f0e1a8da04edce7613967c6afc405d36
