How to Test DoS Defenses
J. Mirkovic, S. Fahmy, P. Reiher, Roshan K. Thomas
2009 Cybersecurity Applications & Technology Conference for Homeland Security, March 3, 2009
DOI: 10.1109/CATCH.2009.23
DoS defense evaluation methods influence how well test results predict performance in real deployment. This paper surveys existing approaches and criticizes their simplicity and lack of realism. We summarize our work on improving DoS evaluation through the development of standardized benchmarks and performance metrics. We end with guidelines for efficiently improving DoS evaluation, in both the short and the long term.
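To make the idea of a DoS performance metric concrete, the sketch below computes one simple impact measure: the fraction of client transactions started during an attack window that miss a latency tolerance. This is a hypothetical illustration, not the specific metric the paper proposes; the transaction records, the 4-second tolerance, and the attack window are invented for the example.

```python
# Hypothetical illustration of a simple DoS-impact metric:
# the fraction of transactions begun during the attack window
# whose completion time exceeds an acceptable threshold.
# All values below are invented for the example and are not
# taken from the paper.

from dataclasses import dataclass

@dataclass
class Transaction:
    start: float      # seconds since experiment start
    duration: float   # observed completion time in seconds

def failed_fraction(transactions, attack_start, attack_end, max_duration):
    """Fraction of in-window transactions that missed the latency tolerance."""
    in_attack = [t for t in transactions
                 if attack_start <= t.start < attack_end]
    if not in_attack:
        return 0.0
    failed = sum(1 for t in in_attack if t.duration > max_duration)
    return failed / len(in_attack)

# Example: web transactions, 4-second tolerance, attack from t=60s to t=120s.
txns = [Transaction(50.0, 0.8), Transaction(70.0, 6.2),
        Transaction(90.0, 0.9), Transaction(110.0, 12.5)]
print(f"Failed fraction under attack: {failed_fraction(txns, 60, 120, 4.0):.2f}")
```

A metric of this shape lets two defenses be compared on the same traffic trace: the defense yielding the lower failed fraction under the same attack preserved more of the service that users actually experience.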