Safe product design, forensic engineering, and Asimov's Laws of Robotics
L. F. Bilancia
2014 IEEE Symposium on Product Compliance Engineering (ISPCE), published 2014-05-05
DOI: 10.1109/ISPCE.2014.6841995
Isaac Asimov wrote a series of science fiction stories about failure analysis of complex systems: his fictional positronic-brained robots. The stories revolve around his “Three Laws of Robotics”. One, a robot may not injure a human being or, through inaction, allow a human being to come to harm. Two, a robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law. Three, a robot must protect its own existence as long as such protection does not conflict with the First or Second Law. We are surrounded by automated systems that routinely violate these Three Laws, yet some systems, such as implanted pacemakers and defibrillators, have specific and distinct circuitry and firmware that implement exactly these rules. Furthermore, as engineers we are called upon to evaluate systems that have failed, determine root cause, and assist the courts in determining culpability. This paper presents a series of examples: systems that implement Asimov's Three Laws well, and systems that categorically fail to implement them. It ties the Three Laws into the Failure Modes and Effects Analysis / Criticality and Severity Analysis (FMEA/CA/SA) standards, and examines the use of the Laws in forensic engineering and failure analysis.
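The abstract's key observation is that the Three Laws form a strict precedence hierarchy, the same structure safety-critical firmware uses when ranking candidate actions. The sketch below is not from the paper; it is a minimal, hypothetical model (the `Action` fields and `choose` function are invented for illustration) of how such a priority-ordered rule filter might be expressed:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Action:
    """A candidate action for a controller (hypothetical model, not the paper's)."""
    name: str
    harms_human: bool       # acting would directly injure a human
    prevents_harm: bool     # acting prevents a human from coming to harm
    human_ordered: bool     # a human operator requested this action
    self_destructive: bool  # acting damages the machine itself

def choose(actions: list[Action]) -> Optional[Action]:
    """Pick an action by applying the Three Laws in strict priority order."""
    # First Law, part 1: discard anything that injures a human.
    safe = [a for a in actions if not a.harms_human]
    # First Law, part 2 (the inaction clause): if some safe action prevents
    # human harm, it outranks every order and every self-preservation concern.
    preventing = [a for a in safe if a.prevents_harm]
    if preventing:
        return preventing[0]
    # Second Law: among the remaining safe actions, obey human orders.
    ordered = [a for a in safe if a.human_ordered]
    if ordered:
        return ordered[0]
    # Third Law: otherwise prefer actions that preserve the machine.
    preserving = [a for a in safe if not a.self_destructive]
    if preserving:
        return preserving[0]
    return safe[0] if safe else None
```

The point of the sketch is the ordering of the filters, not the filters themselves: each lower law is only consulted over the subset of actions the higher laws allow, which mirrors how the paper describes pacemaker/defibrillator firmware subordinating device self-protection to patient safety.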