Hassan A Hassan, Aleksandar Anžel, Bahar İlgen, Marina Luchner, Patrick Schramowski, Anja Blasse, Johannes U Mayer, Katharina Ladewig, Georges Hattab, Maximilian Sprang
Title: A safety-centric perspective on innovation and risk in the use of artificial intelligence in genomics
Journal: Trends in Genetics (Q1, Genetics & Heredity; Impact Factor 16.3)
DOI: 10.1016/j.tig.2026.04.001
Publication date: 2026-04-29
Publication type: Journal Article
Citations: 0
Abstract
Adopting a safety-centric approach, this article explores how generative artificial intelligence (AI), and more specifically, foundation models for biological sequences, can exacerbate data quality issues, technical biases, and dual-use potential, particularly in critical applications such as clinical genetics, precision medicine, and pathogen engineering. This work centres on how misuse risks emerge throughout the innovation pipeline and how these intersect with the growing accessibility of generative genomic models. Particular attention is given to dual-use governance and infrastructure hardening in sequence analysis workflows. The work aims to provide scientists, regulators, and policymakers with a toolkit to discuss beneficial innovation in genomic AI while maintaining robust safeguards against harm and misuse.
About the journal:
Launched in 1985, Trends in Genetics swiftly established itself as a "must-read" for geneticists, offering concise, accessible articles covering a spectrum of topics from developmental biology to evolution. This reputation endures, making TiG a cherished resource in the genetic research community. While evolving with the field, the journal now embraces new areas like genomics, epigenetics, and computational genetics, alongside its continued coverage of traditional subjects such as transcriptional regulation, population genetics, and chromosome biology.
Despite this expanded scope, the core objective of TiG remains steadfast: to furnish researchers and students with high-quality, innovative reviews, commentaries, and discussions that foster an appreciation for advances in genetic research. Each issue of TiG presents lively and up-to-date Reviews and Opinions, alongside shorter articles such as Science & Society and Spotlight pieces. Invited from leading researchers, Reviews objectively chronicle recent developments, Opinions provide a forum for debate and hypothesis, and shorter articles explore the intersection of genetics with science and policy, as well as emerging ideas in the field. All articles undergo rigorous peer review.