{"title":"使用工作流自动化恶意软件扫描","authors":"D. Stirling, I. Welch, P. Komisarczuk, C. Seifert","doi":"10.1109/CCGRID.2009.90","DOIUrl":null,"url":null,"abstract":"Identifying websites hosting malicious code is a priority for helping protect consumers using the web and for the collection of malicious code for analysis by malware researchers. We have been running an InternetNZ sponsored study where homepages of almost all New Zealand Web servers are scanned on a regular basis by a set of client honeypots. This paper reflects upon our experience of running moderate scale scans over a period of several months manually and identifies some requirements for automation of such a system using workflow and related middleware.","PeriodicalId":118263,"journal":{"name":"2009 9th IEEE/ACM International Symposium on Cluster Computing and the Grid","volume":"6 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Automating Malware Scanning Using Workflows\",\"authors\":\"D. Stirling, I. Welch, P. Komisarczuk, C. Seifert\",\"doi\":\"10.1109/CCGRID.2009.90\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Identifying websites hosting malicious code is a priority for helping protect consumers using the web and for the collection of malicious code for analysis by malware researchers. We have been running an InternetNZ sponsored study where homepages of almost all New Zealand Web servers are scanned on a regular basis by a set of client honeypots. This paper reflects upon our experience of running moderate scale scans over a period of several months manually and identifies some requirements for automation of such a system using workflow and related middleware.\",\"PeriodicalId\":118263,\"journal\":{\"name\":\"2009 9th IEEE/ACM International Symposium on Cluster Computing and the Grid\",\"volume\":\"6 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2009-05-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2009 9th IEEE/ACM International Symposium on Cluster Computing and the Grid\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CCGRID.2009.90\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 9th IEEE/ACM International Symposium on Cluster Computing and the Grid","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCGRID.2009.90","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Identifying websites that host malicious code is a priority, both for protecting consumers using the web and for collecting malicious code for analysis by malware researchers. We have been running an InternetNZ-sponsored study in which the homepages of almost all New Zealand web servers are scanned on a regular basis by a set of client honeypots. This paper reflects on our experience of running moderate-scale scans manually over a period of several months, and identifies some requirements for automating such a system using workflow and related middleware.
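The abstract does not describe the implementation, but the core loop it sketches is simple to picture: fan a list of homepage URLs out to a pool of client honeypots and record a verdict for each. Below is a minimal illustrative sketch of that pattern in Python; the honeypot visit itself is stubbed out, and names such as classify_with_honeypot are hypothetical, not the authors' API.

```python
# Hypothetical sketch of the scan workflow the paper argues should be
# automated: distribute homepage URLs across workers and collect verdicts.
from concurrent.futures import ThreadPoolExecutor


def classify_with_honeypot(url: str) -> str:
    """Stub for a client-honeypot visit. A real client honeypot would drive
    an instrumented browser in a disposable VM and flag unauthorized state
    changes; here we just return a placeholder verdict."""
    return "benign"


def run_scan(urls: list[str], workers: int = 4) -> dict[str, str]:
    """One scan pass: visit every URL concurrently and record its verdict."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(urls, pool.map(classify_with_honeypot, urls)))


if __name__ == "__main__":
    sample = ["http://example.co.nz/", "http://example.org.nz/"]
    for url, verdict in run_scan(sample).items():
        print(f"{url}\t{verdict}")
```

In practice, the manual effort the authors describe (scheduling regular passes, restarting failed visits, gathering results from many honeypot instances) is exactly what workflow middleware would take over from a script like this.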