GE-IA-NAM: gene-environment interaction analysis via imaging-assisted neural additive model.

Impact Factor: 5.4
Jingmao Li, Yaqing Xu, Shuangge Ma, Kuangnan Fang
DOI: 10.1093/bioinformatics/btaf481
Journal: Bioinformatics (Oxford, England)
Published: 2025-09-01 (Journal Article)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12452269/pdf/

Abstract


Motivation: Gene-environment (G-E) interaction analysis is crucial in cancer research, offering insights into how genetic and environmental factors jointly influence cancer outcomes. Most existing G-E interaction methods are regression-based, which may lack flexibility to capture complex data patterns. Recent advances have investigated deep neural network-based G-E models. However, these methods may be more vulnerable to information deficiency due to challenges such as limited sample size and high dimensionality. Apart from genetic and environmental data, pathological images have emerged as a widely accessible and informative resource for cancer modeling, presenting its potential to enhance G-E modeling.

Results: We propose the pathological imaging-assisted neural additive model for G-E analysis (GE-IA-NAM). The flexible and interpretable additive network architecture is adopted to account for individualized effects associated with genetic factors, environmental factors, and their interactions. To improve G-E modeling, an assisted-learning strategy is investigated, which adopts a joint analysis to integrate information from pathological images. Simulations and the analysis of lung and skin cancer datasets from The Cancer Genome Atlas demonstrate the competitive performance of the proposed method.
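To make the additive architecture concrete, below is a minimal, illustrative sketch of a neural additive model with G-E interaction terms: each genetic main effect, environmental main effect, and G×E product is passed through its own small subnet, and the outputs are summed. This is a hypothetical forward-pass illustration (all names and the subnet design are assumptions), not the authors' implementation or the imaging-assisted learning strategy; see the GitHub repository for the actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

def subnet(x, w1, b1, w2, b2):
    # One-input MLP "shape function": x has shape (n, 1).
    h = np.maximum(x @ w1 + b1, 0.0)  # ReLU hidden layer
    return h @ w2 + b2                # one scalar output per sample

def init_subnet(hidden=8):
    # Random weights for illustration only (no training shown).
    return (rng.normal(size=(1, hidden)), np.zeros(hidden),
            rng.normal(size=(hidden, 1)), np.zeros(1))

n, p_g, p_e = 5, 3, 2                 # toy dimensions
G = rng.normal(size=(n, p_g))         # genetic factors
E = rng.normal(size=(n, p_e))         # environmental factors

# One subnet per genetic main effect, per environmental main effect,
# and per G x E interaction (input = elementwise product g_j * e_k).
g_nets = [init_subnet() for _ in range(p_g)]
e_nets = [init_subnet() for _ in range(p_e)]
ge_nets = [[init_subnet() for _ in range(p_e)] for _ in range(p_g)]

def predict(G, E):
    # Additive structure: the prediction is a sum of per-term subnets,
    # which keeps each estimated effect individually inspectable.
    out = np.zeros((G.shape[0], 1))
    for j in range(p_g):
        out += subnet(G[:, j:j + 1], *g_nets[j])
    for k in range(p_e):
        out += subnet(E[:, k:k + 1], *e_nets[k])
    for j in range(p_g):
        for k in range(p_e):
            out += subnet(G[:, j:j + 1] * E[:, k:k + 1], *ge_nets[j][k])
    return out

y_hat = predict(G, E)
print(y_hat.shape)  # → (5, 1)
```

Because each term has its own subnet, the fitted contribution of any single gene, environmental factor, or interaction can be plotted against its input, which is the interpretability property the additive design is adopted for.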

Availability and implementation: Python code implementing the proposed method is available at https://github.com/Mr-maoge/NAM-IA-GE. The data that support the findings in this article are openly available in TCGA (The Cancer Genome Atlas) at https://portal.gdc.cancer.gov/.
