{"title":"量化神经活动整合的信息论方法","authors":"Selin Aviyente","doi":"10.1109/ITA.2007.4357556","DOIUrl":null,"url":null,"abstract":"In recent years, there has been a growing interest in quantifying the interaction and integration between different neuronal activities in the brain. One problem of interest has been to quantify how different neuronal sites communicate with each other. For this purpose, different measures of functional integration such as spectral coherence, phase synchrony and mutual information have been proposed. In this paper, we introduce information-theoretic measures such as entropy and divergence to quantify the interaction between different neuronal sites. The information- theoretic measures introduced in this paper are adapted to the time-frequency domain to account for the dynamic nature of neuronal activity. Time-frequency distributions are two-dimensional energy density functions of time and frequency, and can be treated in a way similar to probability density functions. Since time-frequency distributions are not always positive, information measures such as Renyi entropy and Jensen-Renyi divergence are adapted to this new domain instead of the well-known Shannon entropy. In this paper, we first discuss some properties of these modified measures and then illustrate their application to neural signals. The proposed measures are applied to multiple electrode recordings of electroencephalogram (EEG) data to quantify the interaction between different neuronal sites and between different cognitive states.","PeriodicalId":439952,"journal":{"name":"2007 Information Theory and Applications Workshop","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2007-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"Information Theoretic Measures for Quantifying the Integration of Neural Activity\",\"authors\":\"Selin Aviyente\",\"doi\":\"10.1109/ITA.2007.4357556\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In recent years, there has been a growing interest in quantifying the interaction and integration between different neuronal activities in the brain. One problem of interest has been to quantify how different neuronal sites communicate with each other. For this purpose, different measures of functional integration such as spectral coherence, phase synchrony and mutual information have been proposed. In this paper, we introduce information-theoretic measures such as entropy and divergence to quantify the interaction between different neuronal sites. The information- theoretic measures introduced in this paper are adapted to the time-frequency domain to account for the dynamic nature of neuronal activity. Time-frequency distributions are two-dimensional energy density functions of time and frequency, and can be treated in a way similar to probability density functions. Since time-frequency distributions are not always positive, information measures such as Renyi entropy and Jensen-Renyi divergence are adapted to this new domain instead of the well-known Shannon entropy. In this paper, we first discuss some properties of these modified measures and then illustrate their application to neural signals. 
The proposed measures are applied to multiple electrode recordings of electroencephalogram (EEG) data to quantify the interaction between different neuronal sites and between different cognitive states.\",\"PeriodicalId\":439952,\"journal\":{\"name\":\"2007 Information Theory and Applications Workshop\",\"volume\":\"7 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2007-10-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2007 Information Theory and Applications Workshop\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ITA.2007.4357556\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2007 Information Theory and Applications Workshop","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ITA.2007.4357556","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Information Theoretic Measures for Quantifying the Integration of Neural Activity
In recent years, there has been growing interest in quantifying the interaction and integration between different neuronal activities in the brain. One problem of interest is quantifying how different neuronal sites communicate with each other. For this purpose, various measures of functional integration, such as spectral coherence, phase synchrony, and mutual information, have been proposed. In this paper, we introduce information-theoretic measures, such as entropy and divergence, to quantify the interaction between different neuronal sites. These measures are adapted to the time-frequency domain to account for the dynamic nature of neuronal activity. Time-frequency distributions are two-dimensional energy density functions of time and frequency, and can be treated in a way similar to probability density functions. Since time-frequency distributions are not always positive, information measures such as the Rényi entropy and the Jensen-Rényi divergence, rather than the well-known Shannon entropy, are adapted to this domain. We first discuss some properties of these modified measures and then illustrate their application to neural signals. The proposed measures are applied to multi-electrode electroencephalogram (EEG) recordings to quantify the interaction between different neuronal sites and between different cognitive states.
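To make the abstract's quantities concrete, below is a minimal sketch of the order-alpha Rényi entropy, H_alpha(C) = (1/(1-alpha)) log2 sum_{t,f} C(t,f)^alpha, of a time-frequency distribution normalized to unit energy, and of the Jensen-Rényi divergence between two such distributions. This is not the paper's implementation: the function names and test signals are illustrative assumptions, and a spectrogram stands in for the TFD, whereas the paper works with general time-frequency distributions that may take negative values (which is why an odd order such as alpha = 3 is used rather than Shannon entropy).

    # A minimal sketch, not the paper's implementation. A spectrogram stands in
    # for the (generally signed) time-frequency distributions used in the paper;
    # an odd order (alpha = 3) keeps the entropy well defined when the
    # distribution takes negative values.
    import numpy as np
    from scipy.signal import spectrogram

    def renyi_entropy(tfd, alpha=3):
        """Order-alpha Renyi entropy (in bits) of a TFD normalized to unit sum."""
        c = tfd / np.sum(tfd)
        return np.log2(np.sum(c ** alpha)) / (1.0 - alpha)

    def jensen_renyi_divergence(tfd1, tfd2, alpha=3):
        """Entropy of the equal-weight mixture minus the mean of the individual
        entropies; larger values indicate more dissimilar time-frequency structure."""
        c1, c2 = tfd1 / np.sum(tfd1), tfd2 / np.sum(tfd2)
        mix = 0.5 * (c1 + c2)
        return renyi_entropy(mix, alpha) - 0.5 * (
            renyi_entropy(c1, alpha) + renyi_entropy(c2, alpha))

    # Illustrative example: two synthetic "channels" with different
    # time-frequency structure.
    fs = 1000.0
    t = np.arange(0, 1, 1 / fs)
    x1 = np.cos(2 * np.pi * (50 * t + 40 * t ** 2))  # chirp, 50 -> 130 Hz
    x2 = np.cos(2 * np.pi * 200 * t)                 # 200 Hz tone
    _, _, S1 = spectrogram(x1, fs=fs, nperseg=128, noverlap=96)
    _, _, S2 = spectrogram(x2, fs=fs, nperseg=128, noverlap=96)
    print(f"H3(S1) = {renyi_entropy(S1):.2f} bits")
    print(f"JRD(S1, S2) = {jensen_renyi_divergence(S1, S2):.2f} bits")

Under these assumptions, a low Jensen-Rényi divergence between the distributions of two electrode channels would indicate similar time-frequency activity, which is the sense in which such measures can quantify interaction between neuronal sites or between cognitive states.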