{"title":"Cryptanalysis of a hash function, and the modular subset sum problem","authors":"C. Monico","doi":"10.1515/gcc-2019-2001","DOIUrl":null,"url":null,"abstract":"Abstract Recently, Shpilrain and Sosnovski proposed a hash function based on composition of affine maps. In this paper, we show that this hash function with its proposed parameters is not weak collision resistant, for plaintexts of size at least 1.9MB (about 2 24 {2^{24}} bits). Our approach is to reduce the preimage problem to a (very) high density instance of the Random Modular Subset Sum Problem, for which we give an algorithm capable of solving instances of the resulting size. Specifically, given plaintexts of about 1.9MB, we were able to produce other plaintexts of the same size with the same hash value in about 13 hours each, on average.","PeriodicalId":41862,"journal":{"name":"Groups Complexity Cryptology","volume":"17 1","pages":"17 - 23"},"PeriodicalIF":0.1000,"publicationDate":"2019-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Groups Complexity Cryptology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1515/gcc-2019-2001","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"MATHEMATICS","Score":null,"Total":0}
Abstract
Recently, Shpilrain and Sosnovski proposed a hash function based on composition of affine maps. In this paper, we show that this hash function, with its proposed parameters, is not weakly collision resistant for plaintexts of size at least 1.9 MB (about 2^{24} bits). Our approach is to reduce the preimage problem to a (very) high-density instance of the Random Modular Subset Sum Problem, for which we give an algorithm capable of solving instances of the resulting size. Specifically, given plaintexts of about 1.9 MB, we were able to produce other plaintexts of the same size with the same hash value in about 13 hours each, on average.
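To make the construction concrete, here is a minimal sketch of hashing by composition of affine maps modulo a prime: each message bit selects one of two affine maps, and the maps are composed in message order. The modulus, coefficients, and starting value below are toy assumptions for illustration only; they are not the parameters proposed by Shpilrain and Sosnovski or attacked in the paper.

```python
# Toy sketch: hash a bit string by composing one of two affine maps per bit.
# All parameters are hypothetical stand-ins, not the scheme's actual values.

p = 2**127 - 1  # hypothetical prime modulus (a Mersenne prime)

# One affine map f_i(x) = A[i]*x + B[i] (mod p) per bit value; toy coefficients.
A = (3, 5)
B = (7, 11)

def affine_hash(bits, x0=1):
    """Apply the map selected by each bit, left to right, to a start value."""
    x = x0
    for b in bits:
        x = (A[b] * x + B[b]) % p
    return x

print(affine_hash([0, 1, 1, 0, 1, 0, 0, 1]))
```

The reduction targets the Random Modular Subset Sum Problem: given weights a_1, ..., a_n and a target t, find a subset of the weights whose sum is congruent to t modulo M. The paper's algorithm for very high-density instances is not reproduced here; the sketch below solves toy instances with a generic meet-in-the-middle search and is included only to fix the problem statement.

```python
from itertools import combinations

def modular_subset_sum(weights, target, M):
    """Meet-in-the-middle search: enumerate subset sums of each half of the
    weights and look for a pair of partial sums meeting target mod M."""
    half = len(weights) // 2
    left, right = weights[:half], weights[half:]

    # All subset sums of the left half, mapped back to the chosen indices.
    left_sums = {}
    for r in range(len(left) + 1):
        for combo in combinations(range(len(left)), r):
            s = sum(left[i] for i in combo) % M
            left_sums.setdefault(s, combo)

    # For each right-half subset, check whether the complementary sum exists.
    for r in range(len(right) + 1):
        for combo in combinations(range(len(right)), r):
            s = sum(right[i] for i in combo) % M
            need = (target - s) % M
            if need in left_sums:
                return list(left_sums[need]) + [half + i for i in combo]
    return None

# Toy instance: indices [0, 1, 2, 3] give 3 + 9 + 14 + 20 = 46 = 5 (mod 41).
print(modular_subset_sum([3, 9, 14, 20, 27, 31], 5, 41))
```

An instance's density is roughly n / log2(M); when the density is high, solutions are plentiful, which is what makes the preimage search in the paper feasible at the stated plaintext sizes. The brute-force enumeration above costs about 2^{n/2} and is only workable for small toy n.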