HistoSPACE: Histology-inspired spatial transcriptome prediction and characterization engine

Impact Factor 4.2 · CAS Tier 3 (Biology) · JCR Q1, Biochemical Research Methods
Shivam Kumar, Samrat Chatterjee
DOI: 10.1016/j.ymeth.2024.11.002
Journal: Methods, Volume 232, Pages 107–114
Published: 2024-11-07 (Journal Article)
Full text: https://www.sciencedirect.com/science/article/pii/S1046202324002391
Citation count: 0

Abstract

Spatial transcriptomics (ST) enables the visualization of gene expression within the context of tissue morphology. This emerging discipline has the potential to serve as a foundation for developing tools to design precision medicines. However, due to the high costs and expertise required for such experiments, translating them into regular clinical practice is challenging. Despite the application of modern deep learning to enhance the information obtained from histological images, efforts have been constrained by limitations in the diversity of that information. In this paper, we developed a model, HistoSPACE, that exploits the diversity of histological images available with ST data to extract molecular insights from tissue images. Further, our approach allows us to link the predicted expression with disease pathology. Our proposed study built an image encoder derived from a universal image autoencoder. This image encoder was connected to convolution blocks to build the final model, which was further fine-tuned with the help of ST data. The model has few parameters and requires less system memory and comparatively little training time, making it lightweight in comparison to traditional histological models. Our model demonstrates significant efficiency compared to contemporary algorithms, achieving a correlation of 0.56 in leave-one-out cross-validation. Finally, its robustness was validated on an independent dataset, showing predictions consistent with predefined disease pathology. Our code is available at https://github.com/samrat-lab/HistoSPACE.
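The reported 0.56 figure is a correlation between predicted and measured gene expression under leave-one-out cross-validation. A minimal sketch of how such a score could be computed is shown below; the function names and fold structure are illustrative assumptions, not taken from the HistoSPACE codebase:

```python
import numpy as np

def pearson_corr(pred, true):
    """Pearson correlation between a predicted and a measured expression vector."""
    pred = np.asarray(pred, dtype=float)
    true = np.asarray(true, dtype=float)
    pred_c = pred - pred.mean()
    true_c = true - true.mean()
    return float((pred_c @ true_c) / (np.linalg.norm(pred_c) * np.linalg.norm(true_c)))

def loocv_correlation(fold_predictions, fold_truths):
    """Average correlation across leave-one-out folds.

    Each fold holds out one tissue sample; fold_predictions[i] and
    fold_truths[i] are the predicted and measured expression values
    for the spots of that held-out sample.
    """
    scores = [pearson_corr(p, t) for p, t in zip(fold_predictions, fold_truths)]
    return float(np.mean(scores))
```

For example, two folds whose predictions are perfectly linearly related to the ground truth would yield a score of 1.0; the paper's reported 0.56 would be the analogous average over its held-out samples.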
Source journal: Methods (Biology — Biochemical Research Methods)
CiteScore: 9.80
Self-citation rate: 2.10%
Articles per year: 222
Review time: 11.3 weeks
Journal description: Methods focuses on rapidly developing techniques in the experimental biological and medical sciences. Each topical issue, organized by a guest editor who is an expert in the area covered, consists solely of invited quality articles by specialist authors, many of them reviews. Issues are devoted to specific technical approaches with emphasis on clear detailed descriptions of protocols that allow them to be reproduced easily. The background information provided enables researchers to understand the principles underlying the methods; other helpful sections include comparisons of alternative methods giving the advantages and disadvantages of particular methods, guidance on avoiding potential pitfalls, and suggestions for troubleshooting.