{"title":"Contagion-induced risk: An application to the global export network","authors":"E. Vicente, A. Mateos, E. Mateos","doi":"10.1016/j.jcmds.2021.100010","DOIUrl":"https://doi.org/10.1016/j.jcmds.2021.100010","url":null,"abstract":"","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"54 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76131897","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Testing pairs of continuous random variables for independence: A simple heuristic","authors":"M. Khatun, S. Siddiqui","doi":"10.1016/j.jcmds.2021.100012","DOIUrl":"https://doi.org/10.1016/j.jcmds.2021.100012","url":null,"abstract":"","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"14 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86912175","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Yun transform in probabilistic and statistical contexts: Weibull baseline case and its applications in reliability theory","authors":"Christophe Chesneau , M. Girish Babu , Hassan S. Bakouch","doi":"10.1016/j.jcmds.2021.100002","DOIUrl":"https://doi.org/10.1016/j.jcmds.2021.100002","url":null,"abstract":"<div><p>In this paper, we present a new family of distributions based on a particular case of a transform introduced by Yun (2014). Among others, this transform demonstrates great flexibility and nice mathematical properties which can be useful in a statistical context (continuous derivatives of all order, simplicity of the inverse transform, etc.). We propose a new three-parameter distribution from this family, namely the Yun–Weibull (YW) distribution. Some statistical properties of this distribution are studied, involving flexible hazard rate shapes. Subsequently, the statistical inference of the YW distribution is investigated. The parameters are estimated by employing the maximum likelihood estimation method. We establish the existence and uniqueness of the obtained estimators. The YW distribution is applied to fit two practical data sets. As a main result of our analysis, the new distribution is found to be more appropriate to these data sets than other competitive distributions. Moreover, the uniqueness of the parameter estimates of the YW distribution is studied using the profile log-likelihood function visually under the two practical data sets.</p></div>","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"1 ","pages":"Article 100002"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/j.jcmds.2021.100002","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91678001","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Learning 2D Gabor filters by infinite kernel learning regression","authors":"Kamaledin Ghiasi-Shirazi","doi":"10.1016/j.jcmds.2021.100016","DOIUrl":"https://doi.org/10.1016/j.jcmds.2021.100016","url":null,"abstract":"<div><p>Gabor functions have wide-spread applications both in analyzing the visual cortex of mammalians and in designing machine vision algorithms. It is known that the receptive field of neurons of V1 layer in the visual cortex can be accurately modeled by Gabor functions. In addition, Gabor functions are extensively used for feature extraction in machine vision tasks. In this paper, we prove that Gabor functions are translation-invariant positive-definite kernels and show that the problem of image representation with Gabor functions can be formulated as infinite kernel learning regression. Specifically, we use the stabilized infinite kernel learning regression algorithm that has already been introduced for learning translation-invariant positive-definite kernels and has enough flexibility and generality to embrace the class of Gabor kernels. The algorithm yields a representation of the image as a support vector expansion with a compound kernel that is a finite mixture of Gabor functions. The problem with this representation is that all Gabor functions are present at all support vector pixels. Using LASSO, we propose a method for sparse representation of an image with Gabor functions in which each Gabor function is positioned at a very sparse set of pixels. As a practical application, we introduce a novel method for learning a dataset-specific set of Gabor filters that can be used subsequently for feature extraction. Our experiments on CMU-PIE and Extended Yale B datasets show that use of the learned Gabor filters significantly improves the recognition accuracy of a recently introduced face recognition algorithm.</p></div>","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"1 ","pages":"Article 100016"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2772415821000080/pdfft?md5=af240b6063e9a7317487e5c2e4f5c43f&pid=1-s2.0-S2772415821000080-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91678000","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Neural networks as smooth priors for inverse problems for PDEs","authors":"J. Berg, K. Nyström","doi":"10.1016/j.jcmds.2021.100008","DOIUrl":"https://doi.org/10.1016/j.jcmds.2021.100008","url":null,"abstract":"","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83391482","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Efficient adaptive exponential time integrators for nonlinear Schrödinger equations with nonlocal potential","authors":"Winfried Auzinger , Iva Březinová , Alexander Grosz , Harald Hofstätter , Othmar Koch , Takeshi Sato","doi":"10.1016/j.jcmds.2021.100014","DOIUrl":"10.1016/j.jcmds.2021.100014","url":null,"abstract":"<div><p>The performance of exponential-based numerical integrators for the time propagation of the equations associated with the multiconfiguration time-dependent Hartree–Fock (MCTDHF) method for the approximation of the multi-particle Schrödinger equation in one space dimension is assessed. Among the most popular integrators such as Runge–Kutta methods, time-splitting, exponential integrators and Lawson methods, exponential Lawson multistep methods with one predictor–corrector step provide the best stability and accuracy at the least effort. This assessment is based on the observation that the evaluation of the nonlocal terms associated with the potential is the computationally most demanding part of such a calculation in our setting. In addition, the predictor step provides an estimator for the local time-stepping error, thus allowing for adaptive time-stepping which reflects the smoothness of the solution and enables to reliably control the accuracy of a computation in a robust way, without the need to guess an optimal stepsize a priori. One-dimensional model examples are studied to compare different time integrators and demonstrate the successful application of our adaptive methods.</p></div>","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"1 ","pages":"Article 100014"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2772415821000079/pdfft?md5=37b574c538b0935ceb3731af09bc19cf&pid=1-s2.0-S2772415821000079-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77624988","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Neural networks as smooth priors for inverse problems for PDEs","authors":"Jens Berg, Kaj Nyström","doi":"10.1016/j.jcmds.2021.100008","DOIUrl":"https://doi.org/10.1016/j.jcmds.2021.100008","url":null,"abstract":"<div><p>In this paper we discuss the potential of using artificial neural networks as smooth priors in classical methods for inverse problems for PDEs. Exploring that neural networks are global and smooth function approximators, the idea is that neural networks could act as attractive priors for the coefficients to be estimated from noisy data. We illustrate the capabilities of neural networks in the context of the Poisson equation and we show that the neural network approach show robustness with respect to noisy, incomplete data and with respect to mesh and geometry.</p></div>","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"1 ","pages":"Article 100008"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2772415821000043/pdfft?md5=08c6cc3f4e5c45de91102c997960531d&pid=1-s2.0-S2772415821000043-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91677866","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Jordan canonical forms for systems of elliptic equations","authors":"Mosito Lekhooana , Motlatsi Molati , Celestin Wafo Soh","doi":"10.1016/j.jcmds.2021.100006","DOIUrl":"https://doi.org/10.1016/j.jcmds.2021.100006","url":null,"abstract":"<div><p>This work involves the study of elliptic type systems of equations in three independent variables. The Lie point symmetries of the systems are obtained; some of the symmetries of a particular system are used to perform reduction to an invariant system with one less independent variable. The symmetries of the reduced system are also obtained and used for further reduction to a system of ordinary differential equations (ODEs). The invariant solutions of the system of ODEs are constructed.</p></div>","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"1 ","pages":"Article 100006"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/j.jcmds.2021.100006","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91722814","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Testing pairs of continuous random variables for independence: A simple heuristic","authors":"Mahfuza Khatun , Sikandar Siddiqui","doi":"10.1016/j.jcmds.2021.100012","DOIUrl":"https://doi.org/10.1016/j.jcmds.2021.100012","url":null,"abstract":"<div><p>Detection and examination of pairwise dependence patterns between continuous variables is among the central tasks in the fields of business and economic statistics. To perform this analysis, practitioners frequently resort to Pearson’s (1895) product–moment correlation coefficient and the related significance tests. However, the use of such tests in isolation involves the risk of missing the nonlinear and particularly non-monotonic associations between the variables. This problem is also relevant in the cases where the dependence prevails between higher-order moments, e.g., variances, rather than means. We present a simple, computationally inexpensive heuristic by which this problem can be addressed and demonstrate its usefulness in a small number of example cases.</p></div>","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"1 ","pages":"Article 100012"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2772415821000067/pdfft?md5=660c506deaddd9e565da02559154d7a3&pid=1-s2.0-S2772415821000067-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91722825","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Contagion-induced risk: An application to the global export network","authors":"E. Vicente, A. Mateos, E. Mateos","doi":"10.1016/j.jcmds.2021.100010","DOIUrl":"https://doi.org/10.1016/j.jcmds.2021.100010","url":null,"abstract":"<div><p>In many systems, the state of each of their components can itself be a source of risk affecting the other components, and it is not easy to aggregate these individual values together with the interconnecting structural elements of the network. There are simulation models in the literature that establish propagation curves for the population as a whole, especially in the epidemiological case, but these models do not provide a clear analytical expression of the risk borne by each of the network nodes. Moreover, classical models, such as the generalized cascade model, are not necessarily convergent. Neither can the individual values of each node be aggregated on the same scale as they were measured. This paper proposes a mathematical model that makes it possible to analyze the propagation of risk in the face of a given adverse event that may reach all the elements of a network and precisely calculate the risk borne by each node according to its own vulnerability and the relationships with the other nodes, which may be more or less vulnerable and constitute additional sources of risk. It is shown that the new model ensures convergence and that the aggregated results can be interpreted in terms of the risk measurement scale previously given for each node. In addition, the global import–export network is used to illustrate how political or economic instability in one state can generate crises in other states.</p></div>","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"1 ","pages":"Article 100010"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2772415821000055/pdfft?md5=82f3893819e0ad2c6abff625b636d520&pid=1-s2.0-S2772415821000055-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91677865","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}