{"title":"On deep learning for computing the dynamic initial margin and margin value adjustment","authors":"Joel P. Villarino , Alvaro Leitao","doi":"10.1016/j.amc.2025.129679","DOIUrl":null,"url":null,"abstract":"<div><div>The present work addresses the challenge of training neural networks for Dynamic Initial Margin (DIM) computation in counterparty credit risk, a task traditionally burdened by the high costs associated with generating training datasets through nested Monte Carlo (MC) simulations. By condensing the initial market state variables into an input vector, determined through an interest rate model and a parsimonious parameterization of the current interest rate term structure, we construct a training dataset where the labels are future realizations, generated with a single MC path, of the Initial Margin (IM) variable. Since DIM is defined as the conditional expectation of IM, the latter can be understood as noisy and unbiased samples of DIM, allowing the application of deep learning regression techniques to its computation. To this end, a multi-output neural network structure is employed to handle DIM as a time-dependent function, facilitating training across a mesh of monitoring times. This methodology offers significant advantages: it reduces the dataset generation cost to a single MC execution and parameterizes the neural network by initial market state variables, obviating the need for repeated training. Experimental results demonstrate the approach’s convergence properties and robustness across different interest rate models (Hull-White and Cox-Ingersoll-Ross) and portfolio complexities, validating its general applicability and efficiency in more realistic scenarios.</div></div>","PeriodicalId":55496,"journal":{"name":"Applied Mathematics and Computation","volume":"510 ","pages":"Article 129679"},"PeriodicalIF":3.4000,"publicationDate":"2025-08-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Mathematics and Computation","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0096300325004059","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
The present work addresses the challenge of training neural networks for Dynamic Initial Margin (DIM) computation in counterparty credit risk, a task traditionally burdened by the high costs associated with generating training datasets through nested Monte Carlo (MC) simulations. By condensing the initial market state variables into an input vector, determined through an interest rate model and a parsimonious parameterization of the current interest rate term structure, we construct a training dataset where the labels are future realizations, generated with a single MC path, of the Initial Margin (IM) variable. Since DIM is defined as the conditional expectation of IM, the latter can be understood as noisy and unbiased samples of DIM, allowing the application of deep learning regression techniques to its computation. To this end, a multi-output neural network structure is employed to handle DIM as a time-dependent function, facilitating training across a mesh of monitoring times. This methodology offers significant advantages: it reduces the dataset generation cost to a single MC execution and parameterizes the neural network by initial market state variables, obviating the need for repeated training. Experimental results demonstrate the approach’s convergence properties and robustness across different interest rate models (Hull-White and Cox-Ingersoll-Ross) and portfolio complexities, validating its general applicability and efficiency in more realistic scenarios.
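To illustrate the regression idea described in the abstract, the following is a minimal sketch (Python/PyTorch), not the paper's implementation: it trains a multi-output feed-forward network whose outputs correspond to a mesh of monitoring times, using noisy per-sample labels as stand-ins for the single-path IM realizations. Because the loss is mean-squared error, the fitted network approximates the conditional expectation of the labels given the inputs, which is how the abstract characterizes DIM. The market-state inputs, the label-generating function, the network size, and the training settings below are all hypothetical placeholders.

```python
# Minimal sketch (not the paper's implementation): a multi-output MLP regressed
# on noisy per-path "IM" labels. With a mean-squared-error loss, the fitted
# network approximates E[IM | market state] at each monitoring time, i.e. a
# DIM profile. The data below is a synthetic stand-in for the single-path
# Monte Carlo labels described in the abstract.
import torch
import torch.nn as nn

n_samples, n_state, n_times = 4096, 5, 12   # hypothetical sizes

# Placeholder "market state" inputs and noisy single-path labels:
# true_dim is an arbitrary smooth function standing in for the unknown DIM.
x = torch.rand(n_samples, n_state)
t = torch.linspace(0.1, 1.0, n_times)
true_dim = x.sum(dim=1, keepdim=True) * torch.exp(-t)   # shape (n_samples, n_times)
y = true_dim + 0.3 * torch.randn_like(true_dim)         # noisy, unbiased labels

# Multi-output network: one output per monitoring time on the mesh.
model = nn.Sequential(
    nn.Linear(n_state, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, n_times),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

# After training, model(x0) estimates the DIM term profile for a new initial
# market state x0, with no nested Monte Carlo at query time.
x0 = torch.rand(1, n_state)
dim_profile = model(x0).detach()
```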
Journal overview:
Applied Mathematics and Computation addresses work at the interface between applied mathematics, numerical computation, and applications of systems-oriented ideas to the physical, biological, social, and behavioral sciences, and emphasizes papers of a computational nature focusing on new algorithms, their analysis and numerical results.
In addition to presenting research papers, Applied Mathematics and Computation publishes review articles and single-topic issues.