{"title":"Efficient Training of Transformers for Molecule Property Prediction on Small-scale Datasets","authors":"Shivesh Prakash","doi":"arxiv-2409.04909","DOIUrl":null,"url":null,"abstract":"The blood-brain barrier (BBB) serves as a protective barrier that separates\nthe brain from the circulatory system, regulating the passage of substances\ninto the central nervous system. Assessing the BBB permeability of potential\ndrugs is crucial for effective drug targeting. However, traditional\nexperimental methods for measuring BBB permeability are challenging and\nimpractical for large-scale screening. Consequently, there is a need to develop\ncomputational approaches to predict BBB permeability. This paper proposes a GPS\nTransformer architecture augmented with Self Attention, designed to perform\nwell in the low-data regime. The proposed approach achieved a state-of-the-art\nperformance on the BBB permeability prediction task using the BBBP dataset,\nsurpassing existing models. With a ROC-AUC of 78.8%, the approach sets a\nstate-of-the-art by 5.5%. We demonstrate that standard Self Attention coupled\nwith GPS transformer performs better than other variants of attention coupled\nwith GPS Transformer.","PeriodicalId":501266,"journal":{"name":"arXiv - QuanBio - Quantitative Methods","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuanBio - Quantitative Methods","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.04909","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The blood-brain barrier (BBB) is a protective barrier that separates the brain from the circulatory system and regulates the passage of substances into the central nervous system. Assessing the BBB permeability of candidate drugs is crucial for effective drug targeting, but traditional experimental methods for measuring BBB permeability are challenging and impractical for large-scale screening. Computational approaches to predicting BBB permeability are therefore needed. This paper proposes a GPS Transformer architecture augmented with standard self-attention, designed to perform well in the low-data regime. The proposed approach achieves state-of-the-art performance on the BBB permeability prediction task on the BBBP dataset, surpassing existing models: its ROC-AUC of 78.8% exceeds the previous state of the art by 5.5%. We also show that standard self-attention coupled with the GPS Transformer outperforms other attention variants coupled with the same architecture.
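
The abstract gives no implementation details, but the core idea it describes (a GPS-style layer that runs local message passing over molecular bonds alongside standard multi-head self-attention over all atoms, followed by a graph-level readout and a binary BBB-permeability head) can be sketched roughly as follows. This is a minimal illustration in plain PyTorch under assumed featurization (dense atom features and a dense adjacency matrix); the class names GPSBlock and BBBPClassifier, the hyperparameters, and the fusion scheme are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch, not the paper's released code: a GPS-style block that
# combines local message passing over bonds with standard (softmax)
# multi-head self-attention over all atoms, as the abstract describes.
# Names, dimensions, and the dense-adjacency featurization are assumptions.
import torch
import torch.nn as nn


class GPSBlock(nn.Module):
    """One layer: local neighborhood aggregation + global self-attention."""

    def __init__(self, dim: int, num_heads: int = 4, dropout: float = 0.1):
        super().__init__()
        self.local_mlp = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        )
        self.attn = nn.MultiheadAttention(dim, num_heads, dropout=dropout,
                                          batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.ffn = nn.Sequential(
            nn.Linear(dim, 2 * dim), nn.ReLU(), nn.Linear(2 * dim, dim)
        )

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   (batch, num_atoms, dim) atom features
        # adj: (batch, num_atoms, num_atoms) dense adjacency, 1.0 where bonded
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        local = self.local_mlp(adj @ x / deg)    # mean over bonded neighbors
        global_out, _ = self.attn(x, x, x)       # standard self-attention
        x = self.norm1(x + local + global_out)   # fuse local and global branches
        return self.norm2(x + self.ffn(x))


class BBBPClassifier(nn.Module):
    """Stack of GPS-style blocks, mean pooling, binary permeability head."""

    def __init__(self, in_dim: int, dim: int = 64, depth: int = 3):
        super().__init__()
        self.embed = nn.Linear(in_dim, dim)
        self.blocks = nn.ModuleList([GPSBlock(dim) for _ in range(depth)])
        self.head = nn.Linear(dim, 1)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        x = self.embed(x)
        for block in self.blocks:
            x = block(x, adj)
        return self.head(x.mean(dim=1)).squeeze(-1)  # logit: BBB-permeable or not
```

In a setup like this, predictions on a held-out BBBP split would be scored with ROC-AUC, matching the metric reported in the abstract.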