{"title":"心率变异性的信息熵与维数计算","authors":"P.Z. Zhang, S. Reisman, W. Tapp, D. Cordero","doi":"10.1109/NEBC.1993.404438","DOIUrl":null,"url":null,"abstract":"An approach to construct the probability density curve (PDC) of a phase response curve (PRC), in which the PRC reflects directly heart rate fluctuations by phase shifts during different vagal stimuli, is presented. It was found that the width of the PDC usually represents the amount of heart rate variation. The narrow PDC with high peaks implies less heart rate variability, while the wide and flat PDC implies high heart rate variability. Based on the PDC, information entropy has been calculated. Information entropy can give the amount of information needed to specify the state of the PRC or the state of the heart rate variability to an accuracy of the box size. Current work has indicated that large information entropy means large heart rate variability and less predictability. On the other hand, information dimension has been derived from information entropies of the different box sizes. Information dimension is an index of the complexity of a system that generated the data. A low dimension means that less states are needed to describe the system while a large dimension means more states are required to describe the system.<<ETX>>","PeriodicalId":159783,"journal":{"name":"1993 IEEE Annual Northeast Bioengineering Conference","volume":"26 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1993-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Information entropy and dimension calculation on heart rate variability\",\"authors\":\"P.Z. Zhang, S. Reisman, W. Tapp, D. Cordero\",\"doi\":\"10.1109/NEBC.1993.404438\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"An approach to construct the probability density curve (PDC) of a phase response curve (PRC), in which the PRC reflects directly heart rate fluctuations by phase shifts during different vagal stimuli, is presented. It was found that the width of the PDC usually represents the amount of heart rate variation. The narrow PDC with high peaks implies less heart rate variability, while the wide and flat PDC implies high heart rate variability. Based on the PDC, information entropy has been calculated. Information entropy can give the amount of information needed to specify the state of the PRC or the state of the heart rate variability to an accuracy of the box size. Current work has indicated that large information entropy means large heart rate variability and less predictability. On the other hand, information dimension has been derived from information entropies of the different box sizes. Information dimension is an index of the complexity of a system that generated the data. 
A low dimension means that less states are needed to describe the system while a large dimension means more states are required to describe the system.<<ETX>>\",\"PeriodicalId\":159783,\"journal\":{\"name\":\"1993 IEEE Annual Northeast Bioengineering Conference\",\"volume\":\"26 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1993-03-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"1993 IEEE Annual Northeast Bioengineering Conference\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/NEBC.1993.404438\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"1993 IEEE Annual Northeast Bioengineering Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NEBC.1993.404438","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Information entropy and dimension calculation on heart rate variability
An approach is presented for constructing the probability density curve (PDC) of a phase response curve (PRC), where the PRC directly reflects heart rate fluctuations through phase shifts during different vagal stimuli. It was found that the width of the PDC generally represents the amount of heart rate variation: a narrow PDC with high peaks implies low heart rate variability, while a wide, flat PDC implies high heart rate variability. From the PDC, information entropy is calculated. Information entropy gives the amount of information needed to specify the state of the PRC, and hence the state of heart rate variability, to an accuracy equal to the box size. Current work indicates that large information entropy corresponds to large heart rate variability and lower predictability. In addition, the information dimension is derived from the information entropies obtained at different box sizes. The information dimension is an index of the complexity of the system that generated the data: a low dimension means that fewer states are needed to describe the system, while a large dimension means that more states are required.
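To make the two quantities concrete, the sketch below bins a phase response series into boxes of width ε, computes the information entropy H(ε) = −Σ pᵢ log₂ pᵢ, and estimates the information dimension as the slope of H(ε) against log₂(1/ε). This is a minimal, generic box-counting construction, not the authors' implementation; the function names, the synthetic signal standing in for real PRC data, and the chosen box sizes are illustrative assumptions.

```python
import numpy as np

def information_entropy(x, box_size):
    """Bin the signal into boxes of width `box_size` and return the
    Shannon entropy H = -sum(p_i * log2(p_i)) in bits."""
    edges = np.arange(x.min(), x.max() + box_size, box_size)
    counts, _ = np.histogram(x, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def information_dimension(x, box_sizes):
    """Estimate the information dimension as the least-squares slope of
    H(eps) versus log2(1/eps) over the supplied box sizes."""
    H = np.array([information_entropy(x, eps) for eps in box_sizes])
    log_inv_eps = np.log2(1.0 / np.asarray(box_sizes))
    slope, _ = np.polyfit(log_inv_eps, H, 1)
    return slope

if __name__ == "__main__":
    # Hypothetical phase-shift series (seconds); real data would come from
    # a measured PRC under vagal stimulation.
    rng = np.random.default_rng(0)
    prc = rng.normal(0.0, 0.05, size=2000)
    eps_list = [0.04, 0.02, 0.01, 0.005]   # box sizes to sweep
    for eps in eps_list:
        print(f"H(eps={eps}) = {information_entropy(prc, eps):.3f} bits")
    print(f"information dimension estimate: {information_dimension(prc, eps_list):.3f}")
```

Under this construction, a wider, flatter PDC spreads probability over more boxes, raising H(ε) at every box size, which matches the abstract's observation that larger entropy accompanies larger heart rate variability.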