A scenario-based quality assessment of memory acquisition tools and its investigative implications
Lisa Rzepka, Jenny Ottmann, Radina Stoykova, Felix Freiling, Harald Baier
Forensic Science International: Digital Investigation, Volume 52, Article 301868, March 2025
DOI: 10.1016/j.fsidi.2025.301868
Citations: 0
Abstract
During digital forensic investigations, volatile data from random-access memory (RAM) can provide crucial information such as access credentials or encryption keys. This data is usually obtained using software that copies the contents of RAM to a memory dump file concurrently with normal system operation. It is well known that this introduces many inconsistencies into the copied data. Based on established quality criteria from the literature and on four typical investigative scenarios, we present and evaluate a methodology to assess the quality of memory acquisition tools in these scenarios. The methodology relates three factors: (1) the quality criteria of the memory dump, (2) the applied memory forensics analysis technique, and (3) its success in the given investigative scenario. We apply our methodology to four memory acquisition tools, drawn from both the open-source and the commercial communities. It turns out that all tools have weaknesses, but their inconsistencies are not as severe as anticipated. Another finding is that unstructured memory analysis methods are more robust against low-quality (i.e., inconsistent) memory dumps than structured analysis methods. We provide the measurement dataset together with the tool by which it was acquired, and we also examine our findings in the context of legal and international standards for digital forensics in law enforcement investigations.
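To illustrate the distinction the abstract draws, the following minimal sketch (not the paper's actual tooling; the function name and toy data are invented for illustration) shows an "unstructured" analysis in the sense used here: a raw byte-pattern scan over a dump. Such a scan examines each region of memory independently, so an inconsistency in one page does not prevent finding artifacts elsewhere, whereas a "structured" analysis that walks operating-system data structures can fail entirely when pointers in an inconsistent dump no longer match.

```python
import re

def unstructured_scan(dump: bytes, pattern: bytes) -> list[int]:
    """Return the byte offset of every occurrence of `pattern` in a raw dump.

    This treats the dump as an opaque byte stream: no OS structures are
    parsed, so torn or inconsistent pages only affect matches that happen
    to span them.
    """
    return [m.start() for m in re.finditer(re.escape(pattern), dump)]

# Toy "memory dump": padding, then a credential-like string.
dump = b"\x00" * 16 + b"password=hunter2" + b"\xff" * 8
offsets = unstructured_scan(dump, b"password=")  # offset of the artifact
```

A structured counterpart would instead locate process lists or handle tables via kernel structures, which is why dump inconsistency hurts it more.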