{"title":"Construction of file management systems from software components","authors":"D. Batory, J. Barnett, J. Roy, B. C. Twichell, J. Garza","doi":"10.1109/CMPSAC.1989.65108","DOIUrl":"https://doi.org/10.1109/CMPSAC.1989.65108","url":null,"abstract":"Domain analysis, a classical approach to the indentification of reusable software modules, relies on in-depth studies of existing systems, published algorithms, and structures to discern generic architectures for large classes of systems. An architecture is a template in which building-block modules can be plugged. Interfaces are standardized to make blocks interchangeable. The authors explain how domain analysis has led them to a building-blocks technology for file management systems (FMSs), and they describe their most recent prototype, an FMS synthesizer. The synthesizer enables a customized FMS to be assembled from prewritten components in minutes at virtually no cost.<<ETX>>","PeriodicalId":339677,"journal":{"name":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1989-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121825820","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On the definition and maintenance of database views with time-varying domains","authors":"M. Bassiouni, M. Llewellyn","doi":"10.1109/CMPSAC.1989.65086","DOIUrl":"https://doi.org/10.1109/CMPSAC.1989.65086","url":null,"abstract":"An approach and a mechanism for the definition and maintenance of views within temporal database environments are outlined. The view definition language makes use of extending Boolean and comparison operators by allowing their operands to be sets of intervals. A mechanism for maintaining temporal views is presented, and it is shown that temporal database views have far fewer maintenance problems than their standard relational counterparts.<<ETX>>","PeriodicalId":339677,"journal":{"name":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","volume":"174 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1989-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123056970","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Automated protocol modeling and verification combining an entity-based specification language and Petri nets","authors":"S. Shatz, Toshinori Suzuki, T. Murata","doi":"10.1109/CMPSAC.1989.65151","DOIUrl":"https://doi.org/10.1109/CMPSAC.1989.65151","url":null,"abstract":"An approach for automated modeling and verification of communication protocols is presented. A language that specifies input/output behavior of protocol entities is introduced as the starting point, and some verification of the specifications is discussed. Further verification is aided by translation of the specifications to a timed Petri net model.<<ETX>>","PeriodicalId":339677,"journal":{"name":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1989-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123128124","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Quality feedback in software development management-methodology and practical application","authors":"Mamoru Sugahara, Akira Ohmori, Masahiro Tohyama, H. Ikegame, Tetsuya Takase","doi":"10.1109/CMPSAC.1989.65133","DOIUrl":"https://doi.org/10.1109/CMPSAC.1989.65133","url":null,"abstract":"Quality feedback (QFB) is a way to upgrade the quality of feedback between check and action in the plan-do-check-action (PDCA) cycle, especially the quality of feedback closely associated with the improvement of the software development process. To perform QFB, the authors have developed a methodology that includes a model for evaluating software development in phases and a series of methods spanning from data collection and evaluation to the feedback of findings. The application of the QFB methodology to software development projects is described.<<ETX>>","PeriodicalId":339677,"journal":{"name":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1989-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121356261","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Parallel protocol verification using the two-phase algorithm","authors":"M. Yuang, A. Kershenbaum","doi":"10.1109/CMPSAC.1989.65080","DOIUrl":"https://doi.org/10.1109/CMPSAC.1989.65080","url":null,"abstract":"A parallel protocol verification algorithm, called the two-phase algorithm, is proposed in an attempt to provide a maximum of verification with a minimum of state space. Rather than compose all communicating finite-state machines (FSMs) into one large global reachability tree, the two-phase algorithm constructs a local expanded tree for each FSM augmented with external information. The first phase of the algorithm performs the expanded tree construction and the second phase performs error detection based on the constructed expanded trees. By separating verification into two phases, the algorithm allows verification for all FSMs to be executed in parallel. The algorithm thus requires a shorter run-time. Moreover, by introducing a new method for the construction of the expanded trees, the algorithm requires fewer explored states. The algorithm can verify protocols with any number of processes. Verification for protocols with more than two processes is illustrated.<<ETX>>","PeriodicalId":339677,"journal":{"name":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1989-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126437238","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Software testing for critical applications: a position paper","authors":"R. DeMillo","doi":"10.1109/CMPSAC.1989.65138","DOIUrl":"https://doi.org/10.1109/CMPSAC.1989.65138","url":null,"abstract":"Summary form only given. The management of test and evaluation programs for life-critical software, particularly within the US Department of Defense, is largely fragmented and ineffective. As a result, software remains a significant and uncontrolled risk element in major systems. Some problems yet to be adequately addressed are identified.<<ETX>>","PeriodicalId":339677,"journal":{"name":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1989-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129492634","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A knowledge base for code reuse by similarity","authors":"Bonnie K. MacKellar, F. Maryanski","doi":"10.1109/CMPSAC.1989.65158","DOIUrl":"https://doi.org/10.1109/CMPSAC.1989.65158","url":null,"abstract":"A description is given of WharfRat, a knowledge base of data type implementations which employs case-based reasoning as its primary retrieval mechanism. Given a description of an abstract data type, it retrieves the most similar data type implementation in the knowledge base. The focus of the study is the process by which two case descriptions are compared. Similarity between data types is modeled by a fuzzy relation. A set of similarity matching rules has been developed and implemented. The system employs a general, graph-based data model in which object types are organized in a specialization network. Abstract data representations are built using the constructs of the general data model. This system is the first step toward developing a complete programming-by-similarity system.<<ETX>>","PeriodicalId":339677,"journal":{"name":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1989-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123882445","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The influence of manufacturing paradigms on system development methodologies","authors":"D. Coleman","doi":"10.1109/CMPSAC.1989.65179","DOIUrl":"https://doi.org/10.1109/CMPSAC.1989.65179","url":null,"abstract":"Summary form only given. Four fundamental paradigms that can be used to characterize how manufacturers approach development of a product are discussed. They are: build to order, modify to suit, assemble to order, and off the shelf. Each of them is predicated on certain assumptions about the market at which the product is targeted.<<ETX>>","PeriodicalId":339677,"journal":{"name":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","volume":"122 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1989-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116096615","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Parallelizing nested loops on multicomputers-the grouping approach","authors":"C. King, Ing-Ren Kau","doi":"10.1109/CMPSAC.1989.65074","DOIUrl":"https://doi.org/10.1109/CMPSAC.1989.65074","url":null,"abstract":"The design of a tool for partitioning and parallelizing nested loops for execution on distributed-memory multicomputers is presented. The core of the tool is a technique called grouping, which identifies appropriate loop partition patterns based on data dependencies across the iterations. The grouping technique combined with analytic results from performance modeling tools will allow certain nested loops to be partitioned systematically and automatically, without users specifying the data partitions. Grouping is based on the concept of pipelined data parallel computation , which promises to achieve a balanced computation and communication on multicomputers. The basic structure of the parallelizing tool is presented. The grouping and performance analysis techniques for pipelined data parallel computations are described. A prototype of the tool is introduced to illustrate the feasibility of the approach.<<ETX>>","PeriodicalId":339677,"journal":{"name":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1989-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125069359","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Object-oriented programming in a conventional programming environment","authors":"D. Breen, P. Getto, A. Apodaca","doi":"10.1109/CMPSAC.1989.65105","DOIUrl":"https://doi.org/10.1109/CMPSAC.1989.65105","url":null,"abstract":"A programming methodology that implements many object-oriented features within a conventional programming environment is described. The methodology was created during the development of a computer animation system, The Clockworks. The methodology supports such object-oriented features as objects with variables and methods, class hierarchies, variable and method inheritance, object instantiation, and message passing. The methodology does not employ any special keywords or language extensions, thus removing the need for a preprocessor or compiler. The methodology has been implemented in a C/Unix environment. This allows the environment and any system developed within it to be ported to a wide variety of computers which support Unix. The methodology provides many object-oriented features and associated benefits. It also provides all the benefits of a C/Unix environment, including portability, a rich variety of development tools, and efficiency.<<ETX>>","PeriodicalId":339677,"journal":{"name":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","volume":"149 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1989-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122018547","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}