Inferring single-cell resolution spatial gene expression via fusing spot-based spatial transcriptomics, location, and histology using GCN.

IF 6.8 | CAS Region 2 (Biology) | JCR Q1, Biochemical Research Methods
Shuailin Xue, Fangfang Zhu, Jinyu Chen, Wenwen Min
{"title":"Inferring single-cell resolution spatial gene expression via fusing spot-based spatial transcriptomics, location, and histology using GCN.","authors":"Shuailin Xue, Fangfang Zhu, Jinyu Chen, Wenwen Min","doi":"10.1093/bib/bbae630","DOIUrl":null,"url":null,"abstract":"<p><p>Spatial transcriptomics (ST technology allows for the detection of cellular transcriptome information while preserving the spatial location of cells. This capability enables researchers to better understand the cellular heterogeneity, spatial organization, and functional interactions in complex biological systems. However, current technological methods are limited by low resolution, which reduces the accuracy of gene expression levels. Here, we propose scstGCN, a multimodal information fusion method based on Vision Transformer and Graph Convolutional Network that integrates histological images, spot-based ST data and spatial location information to infer super-resolution gene expression profiles at single-cell level. We evaluated the accuracy of the super-resolution gene expression profiles generated on diverse tissue ST datasets with disease and healthy by scstGCN along with their performance in identifying spatial patterns, conducting functional enrichment analysis, and tissue annotation. The results show that scstGCN can predict super-resolution gene expression accurately and aid researchers in discovering biologically meaningful differentially expressed genes and pathways. Additionally, scstGCN can segment and annotate tissues at a finer granularity, with results demonstrating strong consistency with coarse manual annotations. Our source code and all used datasets are available at https://github.com/wenwenmin/scstGCN and https://zenodo.org/records/12800375.</p>","PeriodicalId":9209,"journal":{"name":"Briefings in bioinformatics","volume":"26 1","pages":""},"PeriodicalIF":6.8000,"publicationDate":"2024-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11645551/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Briefings in bioinformatics","FirstCategoryId":"99","ListUrlMain":"https://doi.org/10.1093/bib/bbae630","RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"BIOCHEMICAL RESEARCH METHODS","Score":null,"Total":0}
引用次数: 0

Abstract

Spatial transcriptomics (ST) technology allows for the detection of cellular transcriptome information while preserving the spatial location of cells. This capability enables researchers to better understand cellular heterogeneity, spatial organization, and functional interactions in complex biological systems. However, current technological methods are limited by low resolution, which reduces the accuracy of measured gene expression levels. Here, we propose scstGCN, a multimodal information fusion method based on a Vision Transformer and a Graph Convolutional Network that integrates histological images, spot-based ST data, and spatial location information to infer super-resolution gene expression profiles at the single-cell level. We evaluated the accuracy of the super-resolution gene expression profiles generated by scstGCN on diverse ST datasets from diseased and healthy tissues, along with their performance in identifying spatial patterns, functional enrichment analysis, and tissue annotation. The results show that scstGCN can predict super-resolution gene expression accurately and aid researchers in discovering biologically meaningful differentially expressed genes and pathways. Additionally, scstGCN can segment and annotate tissues at a finer granularity, with results demonstrating strong consistency with coarse manual annotations. Our source code and all used datasets are available at https://github.com/wenwenmin/scstGCN and https://zenodo.org/records/12800375.
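As a rough illustration of the architecture the abstract describes (not the authors' code, which lives in the linked GitHub repository), the sketch below fuses histology-image features (e.g., from a Vision Transformer), spatial coordinates, and spot-level expression, then applies graph convolutions over a spatial neighbourhood graph to regress expression for each super-resolution sub-spot. The helper names, layer sizes, k-NN graph construction, and feature dimensions are all illustrative assumptions.

```python
# Minimal sketch of the multimodal ViT + GCN fusion idea (assumed details, not scstGCN itself).
import torch
import torch.nn as nn
import torch.nn.functional as F


def knn_adjacency(coords: torch.Tensor, k: int = 6) -> torch.Tensor:
    """Symmetrically normalised adjacency (with self-loops) from a spatial k-NN graph."""
    dist = torch.cdist(coords, coords)                      # pairwise Euclidean distances
    knn = dist.topk(k + 1, largest=False).indices           # k neighbours plus self
    n = coords.size(0)
    adj = torch.zeros(n, n)
    adj.scatter_(1, knn, 1.0)
    adj = ((adj + adj.t()) > 0).float()                     # symmetrise
    deg_inv_sqrt = adj.sum(1).clamp(min=1).pow(-0.5)
    return deg_inv_sqrt.unsqueeze(1) * adj * deg_inv_sqrt.unsqueeze(0)


class GCNLayer(nn.Module):
    """One graph convolution: neighbourhood aggregation followed by a linear map."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        return self.linear(adj @ x)


class SuperResolutionGCN(nn.Module):
    """Fuses ViT image features, positional features, and (interpolated) spot expression,
    then predicts per-sub-spot gene expression. Dimensions are placeholders."""
    def __init__(self, img_dim: int = 768, pos_dim: int = 2,
                 expr_dim: int = 1000, hidden: int = 512, n_genes: int = 1000):
        super().__init__()
        self.gcn1 = GCNLayer(img_dim + pos_dim + expr_dim, hidden)
        self.gcn2 = GCNLayer(hidden, n_genes)

    def forward(self, img_feat, coords, spot_expr, adj):
        x = torch.cat([img_feat, coords, spot_expr], dim=1)  # multimodal fusion
        h = F.relu(self.gcn1(x, adj))
        return self.gcn2(h, adj)                             # super-resolution expression


# Usage sketch: 500 sub-spots with ViT features, 2-D coordinates, and mapped spot expression.
coords = torch.rand(500, 2)
model = SuperResolutionGCN()
pred = model(torch.randn(500, 768), coords, torch.randn(500, 1000), knn_adjacency(coords))
```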

Source journal: Briefings in Bioinformatics (Biology – Biochemical Research Methods)
CiteScore: 13.20
Self-citation rate: 13.70%
Articles per year: 549
Review time: 6 months
Journal description: Briefings in Bioinformatics is an international journal serving as a platform for researchers and educators in the life sciences. It also appeals to mathematicians, statisticians, and computer scientists applying their expertise to biological challenges. The journal focuses on reviews tailored for users of databases and analytical tools in contemporary genetics, molecular and systems biology. It stands out by offering practical assistance and guidance to non-specialists in computerized methodologies. Covering a wide range from introductory concepts to specific protocols and analyses, the papers address bacterial, plant, fungal, animal, and human data. The journal's detailed subject areas include genetic studies of phenotypes and genotypes, mapping, DNA sequencing, expression profiling, gene expression studies, microarrays, alignment methods, protein profiles and HMMs, lipids, metabolic and signaling pathways, structure determination and function prediction, phylogenetic studies, and education and training.