{"title":"S02-01 Probabilistic Risk Assessment in Practice","authors":"A. Maertens","doi":"10.1016/j.toxlet.2025.07.046","DOIUrl":null,"url":null,"abstract":"<div><div>Recent developments in computational methods and <em>in vitro</em> models have created opportunities to enhance traditional toxicological risk assessment through probabilistic approaches that better reflect biological reality. Modern toxicology now recognizes that biological perturbations caused by chemicals are an inherently stochastic rather than deterministic processes. From initial molecular interactions to cellular signaling disruption, loss of homeostasis, organ dysfunction, and systemic disease, each stage involves probabilistic events rather than a linear cascade. This fundamental biological understanding necessitates a shift in regulatory approaches to hazard assessment. Probabilistic exposure assessments complement this approach by characterizing the full range of potential exposure scenarios across populations, rather than relying on single point estimates or depending on worst-case assumptions to be fully protective. Moreover, it is critical for regulators to distinguish between the full spectrum of assessment outcomes: high-confidence predictions of low risk that may justify reduced testing requirements, high-confidence predictions of high risk that warrant immediate regulatory action, low-confidence predictions suggesting uncertain risk that require additional scrutiny, and knowledge gaps that require additional data. When data gathering is warranted, regulatory toxicology needs to move away from a checklist approach and develop methods to prioritize data by thinking in terms of a “value-of-information” approach to mitigate uncertainty. How can regulatory frameworks adapt a probabilistic perspective? Key aspects include: (1) transitioning from categorical classifications to dose-dependent probability distributions; (2) establishing confidence thresholds for data-poor substances; (3) developing validation protocols that explicitly model biological variability; (4) addressing expertise gaps within regulatory agencies, especially in regards to uncertainty in artificial intelligence based models; and (5) creating standardized approaches for communicating biological uncertainty in regulatory contexts.</div></div>","PeriodicalId":23206,"journal":{"name":"Toxicology letters","volume":"411 ","pages":"Pages S15-S16"},"PeriodicalIF":2.9000,"publicationDate":"2025-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Toxicology letters","FirstCategoryId":"3","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0378427425016297","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"TOXICOLOGY","Score":null,"Total":0}
Abstract
Recent developments in computational methods and in vitro models have created opportunities to enhance traditional toxicological risk assessment through probabilistic approaches that better reflect biological reality. Modern toxicology now recognizes that the biological perturbations caused by chemicals are inherently stochastic rather than deterministic processes. From initial molecular interactions to cellular signaling disruption, loss of homeostasis, organ dysfunction, and systemic disease, each stage involves probabilistic events rather than a linear cascade. This fundamental biological understanding necessitates a shift in regulatory approaches to hazard assessment. Probabilistic exposure assessments complement this approach by characterizing the full range of potential exposure scenarios across populations, rather than relying on single point estimates or depending on worst-case assumptions to be fully protective. Moreover, it is critical for regulators to distinguish among the full spectrum of assessment outcomes: high-confidence predictions of low risk that may justify reduced testing requirements, high-confidence predictions of high risk that warrant immediate regulatory action, low-confidence predictions suggesting uncertain risk that require additional scrutiny, and knowledge gaps that require additional data. When data gathering is warranted, regulatory toxicology needs to move away from a checklist approach and instead prioritize data generation in terms of a "value-of-information" approach to mitigate uncertainty. How can regulatory frameworks adopt a probabilistic perspective? Key aspects include: (1) transitioning from categorical classifications to dose-dependent probability distributions; (2) establishing confidence thresholds for data-poor substances; (3) developing validation protocols that explicitly model biological variability; (4) addressing expertise gaps within regulatory agencies, especially with regard to uncertainty in artificial-intelligence-based models; and (5) creating standardized approaches for communicating biological uncertainty in regulatory contexts.
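
The contrast between point estimates and probabilistic assessment can be made concrete with a small simulation. The Python sketch below is illustrative only and is not part of the abstract: the lognormal exposure parameters and the logistic dose-response curve (including the ED50 and slope values) are invented for demonstration. It propagates a population exposure distribution through a dose-dependent probability of effect and reports a distribution of risk rather than a single number.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Hypothetical inputs (illustrative values, not from the abstract) ---
# Population exposure: lognormal with median 0.5 mg/kg-bw/day and GSD ~2.5
exposure = rng.lognormal(mean=np.log(0.5), sigma=np.log(2.5), size=100_000)

def p_effect(dose, ed50=5.0, slope=2.0):
    """Dose-dependent probability of an adverse effect: a logistic curve in
    log10(dose), replacing a single categorical "hazard / no hazard" call.
    ed50 and slope are hypothetical parameters (mg/kg-bw/day)."""
    return 1.0 / (1.0 + np.exp(-slope * (np.log10(dose) - np.log10(ed50))))

# Per-individual risk = probability of effect at that individual's exposure
individual_risk = p_effect(exposure)

# Probabilistic outputs: a distribution of risk, not one number
print(f"Population mean risk:      {individual_risk.mean():.4f}")
print(f"95th percentile risk:      {np.percentile(individual_risk, 95):.4f}")
print(f"Fraction with risk > 1e-2: {(individual_risk > 1e-2).mean():.4f}")

# Deterministic contrast: a single worst-case point estimate of exposure
worst_case = np.percentile(exposure, 99.9)
print(f"Worst-case point risk:     {p_effect(worst_case):.4f}")
```

A deterministic assessment would report only the final worst-case line; retaining the full spread of risk across the population is what supports the high-confidence versus low-confidence distinctions described in the abstract.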
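
The "value-of-information" idea can likewise be sketched as a small decision calculation. The example below is a generic expected-value-of-perfect-information (EVPI) computation, not a method from the abstract; the Beta prior on hazard probability and the loss values are assumed purely for illustration. EVPI bounds what any additional test could be worth, which is the quantity a value-of-information approach would use to prioritize data gathering over a checklist.

```python
import numpy as np

rng = np.random.default_rng(7)

# --- Hypothetical decision problem (illustrative values) ---
# Uncertain quantity: probability that the substance is hazardous at
# relevant exposures, described by a Beta prior over theta.
theta = rng.beta(2.0, 8.0, size=200_000)  # prior draws

# Losses (arbitrary units) for each regulatory action given theta:
#   "allow":    expected harm scales with the hazard probability
#   "restrict": fixed societal/economic cost of restriction
loss_allow = 100.0 * theta
loss_restrict = np.full_like(theta, 15.0)

# Decide now, under uncertainty: pick the action with lower expected loss
loss_decide_now = min(loss_allow.mean(), loss_restrict.mean())

# Decide with perfect information: pick the best action for each draw
loss_perfect_info = np.minimum(loss_allow, loss_restrict).mean()

# EVPI: upper bound on what any additional test is worth; tests costing
# more than this cannot be justified for this decision.
evpi = loss_decide_now - loss_perfect_info
print(f"Expected loss, decide now:      {loss_decide_now:.2f}")
print(f"Expected loss, perfect info:    {loss_perfect_info:.2f}")
print(f"EVPI (max justified test cost): {evpi:.2f}")
```

In practice the expected value of a specific, imperfect test (EVSI) would be computed the same way, by averaging over simulated test outcomes; the point of the sketch is only that testing priorities fall out of the decision problem rather than from a fixed checklist.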