Python for information theoretic analysis of neural data

Standard

Python for information theoretic analysis of neural data. / Ince, Robin A A; Petersen, Rasmus S; Swan, Daniel C; Panzeri, Stefano.

In: FRONT NEUROINFORM, Vol. 3, 11.02.2009, p. 4.

Publications: SCORING: Contribution to journal/newspaper › SCORING: Journal article › Research › Peer review

BibTeX

@article{b21b9d9d783f439cbd658960b75d5145,
title = "Python for information theoretic analysis of neural data",
abstract = "Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources.",
author = "Ince, {Robin A A} and Petersen, {Rasmus S} and Swan, {Daniel C} and Stefano Panzeri",
year = "2009",
month = feb,
day = "11",
doi = "10.3389/neuro.11.004.2009",
language = "English",
volume = "3",
pages = "4",
journal = "FRONT NEUROINFORM",
issn = "1662-5196",
publisher = "Frontiers Research Foundation",
}
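
The abstract mentions the limited-sampling bias that affects information estimates computed from finite numbers of trials. As a minimal illustrative sketch (not the toolbox described in the paper; all function names and the toy data below are assumptions), a plug-in mutual information estimate between discrete stimuli and responses, with the first-order Miller-Madow bias correction applied to each entropy term, can be written in a few lines of NumPy:

import numpy as np

def entropy_plugin(counts):
    """Plug-in (maximum likelihood) entropy in bits from a vector of counts."""
    n = counts.sum()
    p = counts[counts > 0] / n
    return -np.sum(p * np.log2(p))

def entropy_miller_madow(counts):
    """Plug-in entropy plus the first-order Miller-Madow bias correction."""
    n = counts.sum()
    k = np.count_nonzero(counts)  # number of occupied response bins
    return entropy_plugin(counts) + (k - 1) / (2 * n * np.log(2))

def mutual_information(stimuli, responses):
    """I(S;R) = H(R) - H(R|S) from paired integer-coded trials."""
    joint = np.zeros((stimuli.max() + 1, responses.max() + 1))
    for s, r in zip(stimuli, responses):
        joint[s, r] += 1
    h_r = entropy_miller_madow(joint.sum(axis=0))
    n = joint.sum()
    # Noise entropy: stimulus-conditional response entropy, weighted by P(s)
    h_r_given_s = sum((row.sum() / n) * entropy_miller_madow(row)
                      for row in joint if row.sum() > 0)
    return h_r - h_r_given_s

# Toy example: 200 trials, 4 stimuli, integer-binned responses
rng = np.random.default_rng(0)
stim = rng.integers(0, 4, size=200)
resp = np.clip(stim + rng.integers(0, 3, size=200), 0, 5)
print(f"I(S;R) = {mutual_information(stim, resp):.3f} bits")

The (k - 1)/(2N ln 2) term removes only the leading-order upward bias of the plug-in entropy; the paper itself concerns more sophisticated treatments of the same problem.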

RIS

TY - JOUR

T1 - Python for information theoretic analysis of neural data

AU - Ince, Robin A A

AU - Petersen, Rasmus S

AU - Swan, Daniel C

AU - Panzeri, Stefano

PY - 2009/2/11

Y1 - 2009/2/11

N2 - Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources.

AB - Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources.

U2 - 10.3389/neuro.11.004.2009

DO - 10.3389/neuro.11.004.2009

M3 - SCORING: Journal article

C2 - 19242557

VL - 3

SP - 4

JO - FRONT NEUROINFORM

JF - FRONT NEUROINFORM

SN - 1662-5196

ER -
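
The abstract also refers to maximum entropy distributions under constraints representing different orders of interaction. A minimal sketch, assuming binary spike words and only first-order (single-cell firing rate) constraints, for which the maximum entropy solution is the independent product of the single-cell marginals (all names and the toy data below are illustrative assumptions, not the paper's method):

import itertools
import numpy as np

def maxent_order1(patterns):
    """Order-1 maximum entropy model over binary words.

    With only single-cell firing rates constrained, the maxent joint
    factorizes into the product of the marginals.
    patterns: (trials, cells) binary array of simultaneous responses.
    Returns probabilities over all 2**cells binary words.
    """
    rates = patterns.mean(axis=0)  # P(cell i fires)
    words = np.array(list(itertools.product([0, 1], repeat=patterns.shape[1])))
    return np.prod(np.where(words == 1, rates, 1 - rates), axis=1)

def entropy_bits(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Toy data: 3 cells correlated through a shared input, 500 trials
rng = np.random.default_rng(1)
shared = rng.random(500) < 0.2
x = ((rng.random((500, 3)) < 0.2) | shared[:, None]).astype(int)

_, counts = np.unique(x, axis=0, return_counts=True)
h_emp = entropy_bits(counts / counts.sum())
h_ind = entropy_bits(maxent_order1(x))
print(f"empirical H = {h_emp:.3f} bits, order-1 maxent H = {h_ind:.3f} bits")

The gap between the order-1 maxent entropy and the empirical word entropy quantifies how much response structure is carried by correlations of order two and higher; constraining higher orders requires iterative fitting rather than the closed-form product used here.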