ExpTamed: An exponential tamed optimizer based on Langevin SDEs
Utku Erdoğan, Şahin Işık, Yıldıray Anagün, Gabriel Lord
Neurocomputing, Volume 651 (2025), Article 130949. DOI: 10.1016/j.neucom.2025.130949. Published 2025-07-18.
This study presents a new optimization method for deep learning, ExpTamed, which regularizes gradients using a novel taming strategy originally devised to control the growth of numerical solutions of stochastic differential equations. Compared with existing techniques, ExpTamed enhances stability and reduces the mean-square error over a short time horizon. Its practical effectiveness is rigorously evaluated on CIFAR-10, Tiny-ImageNet, and Caltech256 across diverse architectures. In direct comparisons with prominent optimizers such as Adam, ExpTamed demonstrates significant performance gains: it improved best top-1 test accuracy by 0.86 to 2.76 percentage points on CIFAR-10, and by up to 4.46 percentage points on Tiny-ImageNet (without a learning-rate schedule). On Caltech256, ExpTamed also yielded superior accuracy, precision, and Kappa metrics. These results quantify ExpTamed's capability to deliver enhanced performance in practical deep learning applications.
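The abstract summarizes the taming idea without stating the update rule. As a rough illustration of how a tamed Langevin scheme bounds gradient growth, here is a minimal NumPy sketch; the exponential taming factor (1 - e^(-h·‖g‖))/(h·‖g‖), the step size h, and the noise scale are illustrative assumptions, not the paper's actual ExpTamed formula.

```python
import numpy as np

def exp_tamed_langevin_step(theta, grad, h=1e-3, temperature=0.0, rng=None):
    """One exponentially tamed Euler-Maruyama step of the Langevin SDE
    d(theta) = -grad(L(theta)) dt + sqrt(2 * temperature) dW.

    Hypothetical sketch: the taming factor below is an assumption for
    illustration, not the ExpTamed rule from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    g_norm = np.linalg.norm(grad)
    # Taming factor in (0, 1]: ~1 for small gradients, ~1/(h*||g||) for large,
    # so the effective step length h * tame * ||g|| stays bounded.
    tame = -np.expm1(-h * g_norm) / (h * g_norm) if g_norm > 0 else 1.0
    noise = np.sqrt(2.0 * temperature * h) * rng.standard_normal(theta.shape)
    return theta - h * tame * grad + noise

# Usage on a toy quadratic loss L(theta) = ||theta||^2 (gradient 2*theta):
theta = np.array([5.0, -3.0])
for _ in range(2000):
    theta = exp_tamed_langevin_step(theta, grad=2.0 * theta, h=0.05)
print(theta)  # drifts toward the minimizer at the origin
```

The design intent of any such factor is that well-behaved gradients pass through essentially untamed, while large gradients are damped enough that the numerical solution cannot blow up, which is the stability property the abstract attributes to the taming strategy.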
Journal overview:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. It covers neurocomputing theory, practice, and applications.