{"title":"关键自治和人工智能系统的代理验证和验证","authors":"P. Laplante, M. Kassab, J. Defranco","doi":"10.1109/STC55697.2022.00014","DOIUrl":null,"url":null,"abstract":"A challenging problem for software and systems engineers is to provide assurance of operations for a system that is critical but must operate in situations that cannot be easily created in the testing lab. For example, a space system cannot be fully tested in all operational modes until it is launched and nuclear power plants cannot be tested under real critical temperature overload conditions. This situation is particularly challenging when seeking to provide assurance in critical AI systems (CAIS) where the underlying algorithms may be very difficult to verify under any conditions. In these cases using systems that have a similar underlying application, operational profiles, user characteristics, and underlying AI algorithms may be suitable as testing proxies. For example, a robot vacuum may have significant operational and implementation similarities to act as a testing proxy for some aspects of an autonomous vehicle.In this work we discuss the challenges in assured autonomy for CAIS and suggest a way forward using proxy systems. We describe a methodology for characterizing CAIS and matching them to their non-critical proxy equivalent. Examples are given along with a discussion of the history of other kinds of proxy verification and validation","PeriodicalId":170123,"journal":{"name":"2022 IEEE 29th Annual Software Technology Conference (STC)","volume":"236 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Proxy Verification and Validation For Critical Autonomous and AI Systems\",\"authors\":\"P. Laplante, M. Kassab, J. 
Defranco\",\"doi\":\"10.1109/STC55697.2022.00014\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A challenging problem for software and systems engineers is to provide assurance of operations for a system that is critical but must operate in situations that cannot be easily created in the testing lab. For example, a space system cannot be fully tested in all operational modes until it is launched and nuclear power plants cannot be tested under real critical temperature overload conditions. This situation is particularly challenging when seeking to provide assurance in critical AI systems (CAIS) where the underlying algorithms may be very difficult to verify under any conditions. In these cases using systems that have a similar underlying application, operational profiles, user characteristics, and underlying AI algorithms may be suitable as testing proxies. For example, a robot vacuum may have significant operational and implementation similarities to act as a testing proxy for some aspects of an autonomous vehicle.In this work we discuss the challenges in assured autonomy for CAIS and suggest a way forward using proxy systems. We describe a methodology for characterizing CAIS and matching them to their non-critical proxy equivalent. 
Examples are given along with a discussion of the history of other kinds of proxy verification and validation\",\"PeriodicalId\":170123,\"journal\":{\"name\":\"2022 IEEE 29th Annual Software Technology Conference (STC)\",\"volume\":\"236 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE 29th Annual Software Technology Conference (STC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/STC55697.2022.00014\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 29th Annual Software Technology Conference (STC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/STC55697.2022.00014","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Proxy Verification and Validation For Critical Autonomous and AI Systems
A challenging problem for software and systems engineers is to provide assurance of operations for a system that is critical but must operate in situations that cannot easily be created in the testing lab. For example, a space system cannot be fully tested in all operational modes until it is launched, and a nuclear power plant cannot be tested under real critical temperature-overload conditions. This situation is particularly challenging when seeking to provide assurance for critical AI systems (CAIS), where the underlying algorithms may be very difficult to verify under any conditions. In these cases, systems with a similar underlying application, operational profile, user characteristics, and underlying AI algorithms may be suitable as testing proxies. For example, a robot vacuum may have enough operational and implementation similarity to act as a testing proxy for some aspects of an autonomous vehicle. In this work we discuss the challenges in assured autonomy for CAIS and suggest a way forward using proxy systems. We describe a methodology for characterizing CAIS and matching them to their non-critical proxy equivalents. Examples are given, along with a discussion of the history of other kinds of proxy verification and validation.
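The matching idea in the abstract — characterizing a CAIS along its application, operational profile, user characteristics, and underlying AI algorithms, then finding a non-critical system that overlaps on those dimensions — can be sketched as a simple similarity score. This is an illustrative assumption, not the paper's actual methodology: all profile fields, feature sets, and the equal-weight Jaccard scoring below are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch: score a candidate proxy against a CAIS by feature
# overlap on the four characterization dimensions named in the abstract.

@dataclass
class SystemProfile:
    name: str
    application: set   # application domain features
    operations: set    # operational-profile features
    users: set         # user characteristics
    algorithms: set    # underlying AI algorithms

def jaccard(a: set, b: set) -> float:
    """Set-overlap similarity in [0, 1]."""
    return len(a & b) / len(a | b) if a | b else 0.0

def proxy_score(cais: SystemProfile, proxy: SystemProfile) -> float:
    # Equal-weight average over dimensions (an assumption; a real matching
    # scheme would likely weight safety-relevant dimensions differently).
    dims = [
        (cais.application, proxy.application),
        (cais.operations, proxy.operations),
        (cais.users, proxy.users),
        (cais.algorithms, proxy.algorithms),
    ]
    return sum(jaccard(a, b) for a, b in dims) / len(dims)

# The abstract's example pairing: autonomous vehicle vs. robot vacuum.
av = SystemProfile(
    "autonomous vehicle",
    {"navigation"},
    {"obstacle avoidance", "path planning"},
    {"untrained operator"},
    {"SLAM", "object detection"},
)
vacuum = SystemProfile(
    "robot vacuum",
    {"navigation"},
    {"obstacle avoidance", "path planning"},
    {"untrained operator"},
    {"SLAM"},
)

print(f"proxy score: {proxy_score(av, vacuum):.2f}")  # high overlap on 3 of 4 dimensions
```

A score near 1.0 would suggest the non-critical system covers most of the characterized dimensions and might serve as a testing proxy for those aspects; which score is "good enough" for assurance purposes is exactly the kind of question the paper's methodology addresses.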