{"title":"Workshop: Digital Discovery with Bootable CDs","authors":"R. Moll, M. Prokop, H. Morgenstern","doi":"10.1109/IMF.2009.20","DOIUrl":"https://doi.org/10.1109/IMF.2009.20","url":null,"abstract":"Boot-CDs are a flexible and powerful method to assist in the whole forensic process from live examination to acquisition, searching and recovery. Linux was ever since the most popular OS for this purpose, but in some cases windows-based Live-CDs are also useful. In this workshop we present different real-life case scenarios and the corresponding live-boot-solution. Since kernel 2.6 Linux is able to create forensically sound images even of partitions/harddisks with odd sectors. But one has to be aware of a lot of other circumstances which can alter the evidence: mounting filesystems, automatic activation of software RAID arrays, using LVMs or swap-space on the target disk. A lot of Linux-Boot-CDs seem to take care of all the critical points, but in fact there are only few well documented tests available. Another problem of the ready-to-download Linux Live-CD images is the lack of support for brand new hardware. So a framework to build a custom linux-live-system with current kernel versions and packages would be really helpful. We will present grml, a Debian based live system, developed by the Austrian Debian Developer Michael Prokop and the grml team. This system satisfies all the above mentioned initial conditions and much more. Various boot parameters allow to control the behavior of the live system, e.g. the parameter \"forensic\", which is a shortcut for \"nofstab noraid noautoconfig noswap raid=noautodetect readonly ...\". Additionally the grml system can be booted from CD/DVD, USB-/Firewire-Device, Remote-Adapter (iLO, RSA2, ...), Flash-Card and PXE. In this workshop you'll learn how to use grml for forensic investigations and how to build your own live system using the grml-live framework. On some brand-new mainboards the grml system might still fail, because the chipset, especially the onboard-raid-chipset is not yet supported by the linux kernel. For these cases a forensically sound windows-based boot-CD as plan B is needed. So the workshop will present a way to build a forensically sound windows based boot CD using the standard Windows Automated Installation Kit for Windows Vista along with some registry modifications.","PeriodicalId":370893,"journal":{"name":"2009 Fifth International Conference on IT Security Incident Management and IT Forensics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129737281","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Semi-autonomous Link Layer Vulnerability Discovery and Mitigation Dissemination","authors":"Ziyad S. Al-Salloum, S. Wolthusen","doi":"10.1109/IMF.2009.14","DOIUrl":"https://doi.org/10.1109/IMF.2009.14","url":null,"abstract":"Risk and vulnerability management is a critical task in maintaining any nontrivial network, but made increasingly difficult by the dynamic nature of internetworking, transient connectivity, and the use of virtual machines that are connected intermittently, while both real and virtual hosts may harbor vulnerabilities that must be addressed to protect both the vulnerable host and its environment whether these are known to an organization’s asset database or not. This is particularly critical if a security incident is in progress and the exposure to a vulnerability must be assessed and potentially mitigated as quickly and completely as possible. In this paper we therefore propose a probabilistic discovery and mitigation algorithm traversing a network with only knowledge of the immediate network neighborhood as can be obtained from passive observation of the LLDP protocol to minimize bandwidth consumption in con- junction with persistent agents deployed by the traversal to capture transient or intermittently active nodes and provide an analysis of the algorithm’s efficiency under different topologies and taking into account link failure as well as inconclusive or failed discovery and mitigation operation probabilities.","PeriodicalId":370893,"journal":{"name":"2009 Fifth International Conference on IT Security Incident Management and IT Forensics","volume":"55 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128982469","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Botnet Statistical Analysis Tool for Limited Resource Computer Emergency Response Team","authors":"K. Kaemarungsi, Nawattapon Yoskamtorn, Kitisak Jirawannakool, Nuttapong Sanglerdsinlapachai, C. Luangingkasut","doi":"10.1109/IMF.2009.13","DOIUrl":"https://doi.org/10.1109/IMF.2009.13","url":null,"abstract":"Botnet is recognized as one of the fastest growing threat to the Internet and most users do not aware that they were victimized. ThaiCERT is one of many computer emergency response teams that have limited resources in term of budget to monitor and handle this kind of threat. An interim solution for teams with limited resource is to subscribe to the Shadowserver Foundation’s mailing list instead of deploying their own capturing and monitoring tools. The valuable information from the Shadowserver Foundation in form of plaintext e-mails may be difficult to manage and analyze. However, there is a need to analyze information provided by the Shadowserver Foundation to be able to efficiently handle botnet’s incidents for our own constituency. In this manuscript, we present our approach to handle the botnet threat using available information from the Shadowserver Foundation and describe our automate tool using by our incident handling team. Finally, we present our statistical data on botnet’s threat in our constituency over the last two years.","PeriodicalId":370893,"journal":{"name":"2009 Fifth International Conference on IT Security Incident Management and IT Forensics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129119362","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Technique to Interrogate an Image of RAM","authors":"Mark Wozar","doi":"10.1109/IMF.2009.10","DOIUrl":"https://doi.org/10.1109/IMF.2009.10","url":null,"abstract":"Using Mr. Aaron Walters' Python script, nistpe.py, which generates hash values for sections within Microsoft Windows portable executables (PE), I will present a technique allowing industry, academia, law-enforcement, and other government bodies to create custom reference sets that detect sections within a raw bit image of random access memory. The technique identifies PE sections within a raw bit image of random access memory by comparing SHA-1 hash values from page-aligned segments to SHA-1 reference file entries. This technique expands on the “immutable sections of known executables” reported earlier. Being able to identify PEs by hash values may facilitate volatile memory analysis and warn of malicious logic.","PeriodicalId":370893,"journal":{"name":"2009 Fifth International Conference on IT Security Incident Management and IT Forensics","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116225150","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Automated User Transparent Approach to log Web URLs for Forensic Analysis","authors":"Muhammad Kamran Ahmed, Mukhtar Hussain, Asad Raza","doi":"10.1109/IMF.2009.12","DOIUrl":"https://doi.org/10.1109/IMF.2009.12","url":null,"abstract":"This paper presents an automated approach to record web activity as the user connects to Internet. It includes monitoring and logging of web URLs visited by the user. The distinctive features of this approach are a) it starts automatically, b) it is transparent to users, c) it is robust against intentional or un-intentional process kill, and d) it is robust against intentional or un-intentional corruption or deletion of log file. The first feature is achieved as the program/application will run with svchost.exe service which is initiated automatically. Transparency is achieved by storing the log file to a default hidden location defined by system variables as well as at a third location (logging server) on the network. Process killing is prevented through dependencies of this application on essential service required to connect to network and thus World Wide Web. The last feature determines that a log activity is also stored in logging server (not accessible to users) even if a user deletes or corrupts it from his local system. The log file contains important information of client, username, date and time of activity and URLs visited. The approach can give vital and potential evidential information of corporate web policy violations, employee monitoring, and law enforcement agencies (digital forensics investigators). This paper also carries out a comparative analysis of the performance and security of proposed scheme against some existing Web forensic and antiforensic tools.","PeriodicalId":370893,"journal":{"name":"2009 Fifth International Conference on IT Security Incident Management and IT Forensics","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129515544","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Safe-Keeping Digital Evidence with Secure Logging Protocols: State of the Art and Challenges","authors":"R. Accorsi","doi":"10.1109/IMF.2009.18","DOIUrl":"https://doi.org/10.1109/IMF.2009.18","url":null,"abstract":"While log data are being increasingly used as digital evidence in court, the extent to which existing secure logging protocols used to collect log data fulfill the legal requirements for admissible evidence remain largely unclear. This paper elucidates a subset of the necessary secure requirements for digital evidence and extensively surveys the state of the art secure logging protocols, thereby demonstrating that none of the current protocols completely fulfills the elucidated requirements for admissible evidence. In analyzing the shortcoming of logging protocols, the paper also elaborates on the related research challenges.","PeriodicalId":370893,"journal":{"name":"2009 Fifth International Conference on IT Security Incident Management and IT Forensics","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116902418","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Fast User Classifying to Establish Forensic Analysis Priorities","authors":"A. Grillo, Alessandro Lentini, G. Me, M. Ottoni","doi":"10.1109/IMF.2009.16","DOIUrl":"https://doi.org/10.1109/IMF.2009.16","url":null,"abstract":"In computer and common crimes, important evidence or clues are increasingly stored in the computers hard disks. The huge and increasing penetration of computers in the daily life together with a considerable increase of storage capacity in mass-market computers, pose, currently, new challenges to forensic operators. Usually a digital forensic investigator has to spend a lot of time in order to find documents, clues or evidence related to the investigation among the huge amount of data extracted from one or more sized hard drive. In particular, the seized material could be very huge, and, very often, only few devices are considered relevant for the investigation. In this paper we propose a methodology and a tool to support a fast computer user profiling via a classification into investigator-defined categories in order to quickly classify the seized computer user. The main purpose of the methodology discussed is to define the class of the user in order to establish an effective schedule with priorities based on the computer user content.","PeriodicalId":370893,"journal":{"name":"2009 Fifth International Conference on IT Security Incident Management and IT Forensics","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116310434","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Self-Forensics Through Case Studies of Small-to-Medium Software Systems","authors":"Serguei A. Mokhov, Emil Vassev","doi":"10.1109/IMF.2009.19","DOIUrl":"https://doi.org/10.1109/IMF.2009.19","url":null,"abstract":"The notion and definition of self-forensics was introduced by Mokhov to encompass software and hardware capabilities for autonomic and other systems to record their own states, events, and others encoded in a forensic form suitable for (potentially automated) forensic analysis, evidence modeling and specification, and event reconstruction for various system components. For self-forensics, “self-dissection” is possible for analysis using a standard language and decision making if the system includes such a self-forensic subsystem. The self-forensic evidence is encoded in a cyberforensic investigation case and event reconstruction language, Forensic Lucid. The encoding of the stories depicted by the evidence comprise a context as a first-class value of a Forensic Lucid “program”, after which an investigator models the case describing relationships between various events and pieces of information. It is important to get the context right for the case to have a meaning and the proper meaning computation, so we perform case studies of some small-to-medium, distributed and not, primarily academic open-source software systems. In this work, for the purpose of implementation of the small self-forensic modules for the data structures and event flow, we specify the requirements of what the context should be for those systems. The systems share in common the base programming language – Java, so our self-forensic logging of the Java data structures and events as Forensic Lucid context specification expressions is laid out ready for an investigator to examine and model the case.","PeriodicalId":370893,"journal":{"name":"2009 Fifth International Conference on IT Security Incident Management and IT Forensics","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114735975","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Forensic Image Generator Generator (Forensig2)","authors":"Christian Moch, F. Freiling","doi":"10.1109/IMF.2009.8","DOIUrl":"https://doi.org/10.1109/IMF.2009.8","url":null,"abstract":"We describe a system that allows to produce file system images for training courses in forensic computing. The instructor can “program” certain user behavior (like copying files and deleting them) in a script file which is then executed by the system using a combination of Python and Qemu. The result is a file system image that can be analysed by students within exercises on forensic computing. The analysis results of the students can then be compared with the “truth” encoded in the input script. The system therefore allows to easily generate large numbers of artificial but still challenging images without the privacy concerns of, for example, using and analysing second hand hard disks.","PeriodicalId":370893,"journal":{"name":"2009 Fifth International Conference on IT Security Incident Management and IT Forensics","volume":"188 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123032187","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Comprehensive and Comparative Analysis of the Patching Behavior of Open Source and Closed Source Software Vendors","authors":"G. Schryen","doi":"10.1109/IMF.2009.15","DOIUrl":"https://doi.org/10.1109/IMF.2009.15","url":null,"abstract":"While many theoretical arguments against or in favor of open source and closed source software development have been presented, the empirical basis for the assessment of arguments is still weak. Addressing this research gap, this paper presents a comprehensive empirical investigation of the patching behavior of software vendors/communities of widely deployed open source and closed source software packages, including operating systems, database systems, web browsers, email clients, and office systems. As the value of any empirical study relies on the quality of data available, this paper also discusses in detail data issues, explains to what extent the empirical analysis can be based on vulnerability data contained in the NIST National Vulnerability Database, and shows how data on vulnerability patches was collected by the author to support this study. The results of the analysis suggest that it is not the particular software development style that determines patching behavior, but rather the policy of the particular software vendor.","PeriodicalId":370893,"journal":{"name":"2009 Fifth International Conference on IT Security Incident Management and IT Forensics","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132803214","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}