{"title":"DeepEPI:基于cnn -transformer的模型,通过预测增强子-启动子相互作用来提取TF相互作用。","authors":"Seyedeh Fatemeh Tabatabaei, Saeedeh Akbari Roknabadi, Somayyeh Koohi","doi":"10.1093/bioadv/vbaf221","DOIUrl":null,"url":null,"abstract":"<p><strong>Motivation: </strong>We introduce DeepEPI, a deep learning framework for studying enhancer-promoter interactions (EPIs) directly from genomic sequences. By integrating convolutional neural networks (CNNs) with Transformer blocks, DeepEPI captures the complex regulatory interplay between enhancers and promoters, a key factor in gene expression and disease mechanisms. The model emphasizes interpretability and efficiency by employing embedding layers for OneHot encoding and multihead attention for detecting and analyzing transcription factor (TF) interactions. A DNA2Vec-based version of DeepEPI is also evaluated.</p><p><strong>Results: </strong>DeepEPI is assessed in two dimensions: comparison with existing models and analysis of encoding methods. Across six cell lines, DeepEPI consistently outperforms prior approaches. Compared to EPIVAN, it achieves a 2.4% gain in area under the precision-recall curve (AUPR) and maintains AUROC with DNA2Vec encoding, while with OneHot encoding it shows a 4% increase in AUPR and 1.9% in AUROC. Regarding encoding, DNA2Vec provides higher accuracy, but our OneHot-based embedding balances competitive performance with interpretability and reduced storage requirements. Beyond prediction, DeepEPI enhances biological insight by extracting meaningful TF-TF interactions from attention heads, effectively narrowing the search space for experimental validation. Validation analyses further support the biological relevance of these findings, underscoring DeepEPI's value for advancing EPI research.</p><p><strong>Availability and implementation: </strong>The source code of DeepEPI is available at: https://github.com/nazanintbtb/DeepEPI.git.</p>","PeriodicalId":72368,"journal":{"name":"Bioinformatics advances","volume":"5 1","pages":"vbaf221"},"PeriodicalIF":2.8000,"publicationDate":"2025-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12478696/pdf/","citationCount":"0","resultStr":"{\"title\":\"DeepEPI: CNN-transformer-based model for extracting TF interactions through predicting enhancer-promoter interactions.\",\"authors\":\"Seyedeh Fatemeh Tabatabaei, Saeedeh Akbari Roknabadi, Somayyeh Koohi\",\"doi\":\"10.1093/bioadv/vbaf221\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Motivation: </strong>We introduce DeepEPI, a deep learning framework for studying enhancer-promoter interactions (EPIs) directly from genomic sequences. By integrating convolutional neural networks (CNNs) with Transformer blocks, DeepEPI captures the complex regulatory interplay between enhancers and promoters, a key factor in gene expression and disease mechanisms. The model emphasizes interpretability and efficiency by employing embedding layers for OneHot encoding and multihead attention for detecting and analyzing transcription factor (TF) interactions. A DNA2Vec-based version of DeepEPI is also evaluated.</p><p><strong>Results: </strong>DeepEPI is assessed in two dimensions: comparison with existing models and analysis of encoding methods. Across six cell lines, DeepEPI consistently outperforms prior approaches. 
Compared to EPIVAN, it achieves a 2.4% gain in area under the precision-recall curve (AUPR) and maintains AUROC with DNA2Vec encoding, while with OneHot encoding it shows a 4% increase in AUPR and 1.9% in AUROC. Regarding encoding, DNA2Vec provides higher accuracy, but our OneHot-based embedding balances competitive performance with interpretability and reduced storage requirements. Beyond prediction, DeepEPI enhances biological insight by extracting meaningful TF-TF interactions from attention heads, effectively narrowing the search space for experimental validation. Validation analyses further support the biological relevance of these findings, underscoring DeepEPI's value for advancing EPI research.</p><p><strong>Availability and implementation: </strong>The source code of DeepEPI is available at: https://github.com/nazanintbtb/DeepEPI.git.</p>\",\"PeriodicalId\":72368,\"journal\":{\"name\":\"Bioinformatics advances\",\"volume\":\"5 1\",\"pages\":\"vbaf221\"},\"PeriodicalIF\":2.8000,\"publicationDate\":\"2025-09-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12478696/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Bioinformatics advances\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1093/bioadv/vbaf221\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2025/1/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICAL & COMPUTATIONAL BIOLOGY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Bioinformatics advances","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1093/bioadv/vbaf221","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/1/1 0:00:00","PubModel":"eCollection","JCR":"Q2","JCRName":"MATHEMATICAL & COMPUTATIONAL BIOLOGY","Score":null,"Total":0}
DeepEPI: CNN-transformer-based model for extracting TF interactions through predicting enhancer-promoter interactions.
Motivation: We introduce DeepEPI, a deep learning framework for studying enhancer-promoter interactions (EPIs) directly from genomic sequences. By integrating convolutional neural networks (CNNs) with Transformer blocks, DeepEPI captures the complex regulatory interplay between enhancers and promoters, a key factor in gene expression and disease mechanisms. The model emphasizes interpretability and efficiency by employing embedding layers for OneHot encoding and multihead attention for detecting and analyzing transcription factor (TF) interactions. A DNA2Vec-based version of DeepEPI is also evaluated.
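As a rough illustration of the CNN-plus-Transformer pairing described above (a minimal sketch, not DeepEPI's published architecture), the following PyTorch code embeds integer-encoded enhancer and promoter sequences, extracts local features with 1D convolutions, and mixes them with multi-head self-attention before producing a sigmoid interaction score. All class names (EPIBranch, EPIModel), layer sizes, sequence lengths, and head counts are illustrative assumptions.

```python
# Hypothetical sketch of a CNN-Transformer EPI classifier; hyperparameters are guesses.
import torch
import torch.nn as nn

class EPIBranch(nn.Module):
    """Embed integer-encoded bases, then extract local motif-like features with a 1D CNN."""
    def __init__(self, vocab_size=5, embed_dim=16, conv_channels=64, kernel_size=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)   # learned stand-in for one-hot input
        self.conv = nn.Conv1d(embed_dim, conv_channels, kernel_size, padding="same")
        self.pool = nn.MaxPool1d(4)

    def forward(self, tokens):                              # tokens: (batch, seq_len)
        x = self.embed(tokens).transpose(1, 2)              # (batch, embed_dim, seq_len)
        x = self.pool(torch.relu(self.conv(x)))             # (batch, channels, seq_len // 4)
        return x.transpose(1, 2)                            # (batch, seq_len // 4, channels)

class EPIModel(nn.Module):
    """Concatenate CNN features from both sequences and mix them with self-attention."""
    def __init__(self, conv_channels=64, n_heads=4, n_layers=2):
        super().__init__()
        self.enhancer = EPIBranch(conv_channels=conv_channels)
        self.promoter = EPIBranch(conv_channels=conv_channels)
        layer = nn.TransformerEncoderLayer(d_model=conv_channels, nhead=n_heads,
                                           batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.classifier = nn.Linear(conv_channels, 1)

    def forward(self, enh_tokens, prom_tokens):
        feats = torch.cat([self.enhancer(enh_tokens), self.promoter(prom_tokens)], dim=1)
        attended = self.transformer(feats)                  # (batch, positions, channels)
        score = self.classifier(attended.mean(dim=1))       # pool over positions
        return torch.sigmoid(score).squeeze(-1)             # interaction probability

# Toy forward pass: a 3000 bp enhancer and a 2000 bp promoter, batch of 2.
model = EPIModel()
enh = torch.randint(0, 5, (2, 3000))
prom = torch.randint(0, 5, (2, 2000))
print(model(enh, prom).shape)   # torch.Size([2])
```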
Results: DeepEPI is assessed in two dimensions: comparison with existing models and analysis of encoding methods. Across six cell lines, DeepEPI consistently outperforms prior approaches. Compared to EPIVAN, it achieves a 2.4% gain in the area under the precision-recall curve (AUPR) and maintains the area under the receiver operating characteristic curve (AUROC) with DNA2Vec encoding, while with OneHot encoding it shows a 4% increase in AUPR and a 1.9% increase in AUROC. Regarding encoding, DNA2Vec provides higher accuracy, but our OneHot-based embedding balances competitive performance with interpretability and reduced storage requirements. Beyond prediction, DeepEPI enhances biological insight by extracting meaningful TF-TF interactions from attention heads, effectively narrowing the search space for experimental validation. Validation analyses further support the biological relevance of these findings, underscoring DeepEPI's value for advancing EPI research.
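To make the attention-based analysis concrete, here is a minimal, hypothetical Python sketch of how candidate TF-TF interactions could be read off an attention head: attention weights between sequence positions that overlap known TF motif hits are accumulated per TF pair and ranked. The attention matrix, the position-to-motif mapping (motif_at_position), and the TF names are illustrative assumptions, not DeepEPI's actual outputs or annotation pipeline.

```python
# Hedged sketch: rank TF pairs by the attention mass exchanged between motif-bearing positions.
import numpy as np
from collections import Counter
from itertools import product

rng = np.random.default_rng(0)
attn = rng.random((8, 8))                 # toy stand-in for one head's attention over 8 positions
attn /= attn.sum(axis=-1, keepdims=True)  # row-normalize, as a softmax output would be

# Hypothetical mapping: pooled position -> TF whose motif overlaps that window.
motif_at_position = {0: "CTCF", 2: "YY1", 3: "SP1", 5: "GATA1", 7: "CTCF"}

pair_scores = Counter()
for i, j in product(range(attn.shape[0]), repeat=2):
    if i == j:
        continue
    tf_i, tf_j = motif_at_position.get(i), motif_at_position.get(j)
    if tf_i and tf_j:
        # Accumulate the attention weight exchanged between the two motif positions.
        pair_scores[tuple(sorted((tf_i, tf_j)))] += attn[i, j]

# Top-ranked candidate TF-TF interactions; a real analysis would threshold or test these scores.
for pair, score in pair_scores.most_common(3):
    print(pair, round(float(score), 3))
```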
Availability and implementation: The source code of DeepEPI is available at: https://github.com/nazanintbtb/DeepEPI.git.