Software complexity analysis on Department of Defense real-time systems

William G. Castaneda
Proceedings 1997 High-Assurance Systems Engineering Workshop, August 11, 1997. DOI: 10.1109/HASE.1997.648051

I am grateful to Mr. Ray Paul and Dr. Yen for inviting me to participate in the 2nd IEEE High Assurance Systems Engineering Workshop. As a software analyst representing Commander, Operational Test and Evaluation Force (COMOPTEVFOR), I am eager to learn about the latest technological research in developing and evaluating high assurance systems. I would also like to share my warfighter's perspective and expectations on high assurance systems. COMOPTEVFOR is the U.S. Navy's sole agent for operational test and evaluation (OT&E). For new and improved capabilities proposed for fleet use, our mission includes conducting OT&E in a realistic environment against the anticipated threat; advising the Chief of Naval Operations on system operational effectiveness and operational suitability; and making recommendations for fleet introduction.

Approximately 210 naval officers and non-commissioned officers, representing every warfare area, are assigned as Operational Test Directors, responsible for planning, testing, and evaluating over 381 programs. The most important part of their job is to determine system effectiveness and suitability before the system is delivered to the fleet. Without a doubt, high assurance systems are of critical importance to the U.S. Navy and its ability to maintain operational readiness. From an operational tester's perspective, "high assurance" means asking: will the system be dependable, reliable, and available, and can operators maintain it? COMOPTEVFOR has recognized the need to become more familiar with the capabilities and limitations of systems whose critical functionality resides in software. Early operational tester involvement in high assurance system development will support realistic operational assessment of system suitability.

For the Navy, operational suitability is the capability of the system, when operated and maintained by typical fleet personnel in the expected numbers and of the expected experience level, to be reliable, maintainable, operationally available, logistically supportable when deployed, compatible, interoperable, and safe. COMOPTEVFOR's early involvement and participation in risk mitigation are alive and well, as exemplified during a recent visit to a contractor's site for a review of operational-interest software code. COMOPTEVFOR representatives performed a trace from operational requirements to code, which identified the need for a contractor's lab demonstration of the weapon to clarify a firing feature. The high-assurance issue at hand: "Will this weapon fire when it is not supposed to?" The end result was an inadvertent firing and a Priority 1 (Safety) Software Trouble Report. This was gloomy in a sense for the contractor, but the problem was fixed before the upcoming at-sea test, and a safety-related failure was thus avoided. This early failure detection prevented an expensive fix and re-test later.

A multi-service research project led by COMOPTEVFOR has been initiated to determine the capabilities and limitations of CASE (Computer-Aided Software Engineering) analysis tools on highly complex real-time and near-real-time systems. Both manual and automated techniques are being evaluated across three Department of Defense projects to gain more insight into the real-time characteristics of software-intensive systems. Preliminary data point to the benefits of using more than one source of information to mitigate development risks. Specifically, analysis has demonstrated that source code structure changes (made possible through the use of a manual checklist) can result in lower complexity values as reported by a CASE tool (good from the tool's perspective) but slower algorithm execution times (bad from the user's perspective). This is an important finding, given that some program management guidelines require developers to stay below a particular complexity figure reported by a CASE tool. The manual code verification checklist has provided a vehicle for investigating areas not directly analyzed by the automated tool. Positive aspects of the generic checklist include cross-project comparison of requirements traceability and pseudo-code structure.
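The complexity-versus-speed trade-off described above can be sketched in a few lines of Python. The decision-point counter below is a simplified stand-in for a commercial CASE tool's McCabe-style cyclomatic complexity metric, and the two `classify` variants are hypothetical illustrations, not code from the projects discussed: a table-driven restructuring scores lower on the metric, yet each call now pays for a loop and extra function calls, so the metric improvement can coexist with slower execution.

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe cyclomatic complexity: 1 + number of decision points."""
    decisions = 0
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.If, ast.For, ast.While, ast.IfExp, ast.ExceptHandler)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):
            # 'a and b and c' contributes two decision points.
            decisions += len(node.values) - 1
    return decisions + 1

# A chain of conditionals: each branch adds one decision point.
branchy = """
def classify(x):
    if x < 0:
        return "neg"
    elif x == 0:
        return "zero"
    elif x < 10:
        return "small"
    else:
        return "large"
"""

# Table-driven restructuring: the metric drops (one loop, one test),
# but every call now scans the table and invokes lambdas at runtime.
table_driven = """
def classify(x):
    table = [(lambda v: v < 0, "neg"),
             (lambda v: v == 0, "zero"),
             (lambda v: v < 10, "small")]
    for test, label in table:
        if test(x):
            return label
    return "large"
"""

print(cyclomatic_complexity(branchy))       # 4 (three if/elif branches)
print(cyclomatic_complexity(table_driven))  # 3 (one for, one if)
```

The sketch shows why a complexity ceiling enforced in isolation can mislead: the restructured version satisfies a lower complexity threshold while being the slower of the two, which is precisely the kind of finding that consulting more than one source of information (tool output plus manual checklist plus timing data) is meant to surface.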