{"title":"Automated Generation of Use Case Descriptions from Problem Frames","authors":"A. Fatolahi, S. Somé, T. Lethbridge","doi":"10.1109/SERA.2010.36","DOIUrl":"https://doi.org/10.1109/SERA.2010.36","url":null,"abstract":"In order to reduce the risk of failure in software projects, it is critical to achieve a true understanding of the problem and requirements. Several requirements engineering tools and techniques have been proposed, amongst which problem-oriented approaches are recognized as techniques that start with problem analysis rather than solution analysis. Such approaches are distinguished by their ability to solve a software-related problem based on the category the problem falls into. In this paper, a mapping from problem frames (PF), a problem-oriented approach, to use cases, the most popular requirements engineering technique, is provided. Bridging problem frames with use cases is important for benefiting from the popularity of use cases while retaining the advantages of problem frames. It can also build trust in PF approaches within the software engineering community.","PeriodicalId":102108,"journal":{"name":"2010 Eighth ACIS International Conference on Software Engineering Research, Management and Applications","volume":"157 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133645287","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Investigation of the Capability of XP to Support the Requirements of ISO 9001 Software Process Certification","authors":"Malik Qasaimeh, A. Abran","doi":"10.1109/SERA.2010.38","DOIUrl":"https://doi.org/10.1109/SERA.2010.38","url":null,"abstract":"For software organizations needing ISO 9001 certification, it is important to establish a software process life cycle that can manage the requirements imposed by this certification standard. This paper presents an analysis of extreme programming (XP) from the ISO 9001 and ISO 90003 perspectives. The focus is to extract the requirements related to the ISO product realization process and to determine the strengths and weaknesses of XP in handling those requirements.","PeriodicalId":102108,"journal":{"name":"2010 Eighth ACIS International Conference on Software Engineering Research, Management and Applications","volume":"519 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116263441","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Requirements Management Tool with Evolving Traceability for Heterogeneous Artifacts in the Entire Life Cycle","authors":"Youngki Hong, Minho Kim, Sang-Woong Lee","doi":"10.1109/SERA.2010.39","DOIUrl":"https://doi.org/10.1109/SERA.2010.39","url":null,"abstract":"Cost, effort, and quality are significant factors in software project management, and much software engineering work has focused on them. With respect to software quality, customer requirements are the starting point for assuring quality in software development projects [3]. The software engineering literature is still seeking effective ways to manage requirements across the entire project life cycle and does not yet have a complete solution; moreover, some existing solutions are quite time-consuming for project management. Tools for managing requirements therefore help keep specifications consistent, up to date, and efficiently accessible. The purpose of this paper is to describe the development of a new requirements management tool that supports the evolution aspect of the grand challenges of traceability, in addition to capturing requirement specifications and establishing traceability links. An approach to support the sustained evolution of traceability links is proposed and outlined. A fine-grained differencing approach on the link endpoints is used to maintain the links in a scalable manner. Details of the link model, representation, and screens are given, followed by the process used to evolve traceability links.","PeriodicalId":102108,"journal":{"name":"2010 Eighth ACIS International Conference on Software Engineering Research, Management and Applications","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123069645","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Process Improvement Based on Causal Networks","authors":"R. Dumke, Karsten Richter, K. Georgieva, E. Asfoura","doi":"10.1109/SERA.2010.43","DOIUrl":"https://doi.org/10.1109/SERA.2010.43","url":null,"abstract":"This paper presents a causal-based modelling of software measurement processes in order to clarify empirical reasoning in the software engineering field. A first overview of existing causal network approaches shows the problems and possible benefits of using these formal techniques in the software engineering area. The definition and extension of causal modelling using causal networks helps in understanding the relationships between the different software process artefacts and their causalities. The causal-network-based process model (CNPM) concept is based on Pearl's causal network idea. The description of first applications of the CNPM approach to CMMI demonstrates the empirical reasoning of software improvement processes in an explicit manner.","PeriodicalId":102108,"journal":{"name":"2010 Eighth ACIS International Conference on Software Engineering Research, Management and Applications","volume":"330 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122096333","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Software Modeling and Implementation of Reliable Server Pooling and RSPLIB","authors":"Xing Zhou, T. Dreibholz, M. Becke, Jobin Pulinthanath, E. Rathgeb, Wencai Du","doi":"10.1109/SERA.2010.26","DOIUrl":"https://doi.org/10.1109/SERA.2010.26","url":null,"abstract":"With the growing complexity of software applications, there is an increasing demand for solutions to distribute workload into server pools. Grid Computing provides powerful -- but also highly complex -- mechanisms to realize such tasks. In addition, a steadily growing number of downtime-critical applications require redundant servers to ensure service availability in case of component failures. To cope with the demand for server redundancy and service availability, the IETF has recently standardized the lightweight Reliable Server Pooling (RSerPool) framework, a common architecture for server pool and session management. In this paper, we first introduce the concept of RSerPool and then present the modeling considerations behind RSPLIB and the underlying general groupware design. Based on RSPLIB, we illustrate how applications can easily be developed on top of RSerPool. We also offer an application evaluation example for a proof-of-concept setup that distributes ray-tracing computation workload into a compute pool.","PeriodicalId":102108,"journal":{"name":"2010 Eighth ACIS International Conference on Software Engineering Research, Management and Applications","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128278172","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Process Patterns for MDA-Based Software Development","authors":"Mohsen Asadi, N. Esfahani, Raman Ramsin","doi":"10.1109/SERA.2010.32","DOIUrl":"https://doi.org/10.1109/SERA.2010.32","url":null,"abstract":"Information systems are expected to satisfy increasingly ambitious requirements, while reducing time-to-market has become a primary objective. This trend has necessitated the advent of development approaches that are better equipped and flexible enough to cope with modern challenges. Model-Driven Architecture (MDA) and Situational Method Engineering (SME) are approaches addressing this requirement: MDA provides promising means for automating the software process and revitalizes the role of modeling in software development, while SME focuses on project-specific methodology construction, mainly through assembling reusable method fragments (process patterns) retrieved from a method base. We provide a set of high-level process patterns for model-driven development, derived from a study of six prominent MDA-based methodologies, which form the basis for a proposed generic MDA Software Process (MDASP). These process patterns can promote SME by providing classes of common process components that can be used for assembling, tailoring, and extending MDA-based methodologies.","PeriodicalId":102108,"journal":{"name":"2010 Eighth ACIS International Conference on Software Engineering Research, Management and Applications","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128155778","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Applying the Psychometric Theory to Questionnaire-Based Appraisals for Software Process Improvement","authors":"I. García, C. Pacheco, Gabriel Andrade","doi":"10.1109/SERA.2010.33","DOIUrl":"https://doi.org/10.1109/SERA.2010.33","url":null,"abstract":"Software process assessment is a key factor for organizations in determining their current capability/maturity level and in adopting a Software Process Improvement initiative. Their chance of success using a standard model depends on a reliable assessment of their current processes and on determining which processes need to be improved. However, existing automated tools for process assessment cannot verify the authenticity of answers and are therefore limited in their reliability, depending only on the employees’ responses. This paper presents our research on psychometric theory applied to questionnaire-based appraisals, to determine the feasibility of combining the two in order to develop a reliable assessment mechanism that provides more dependable evidence about an organization’s current maturity/capability level. We have found that existing tools implicitly include some elements of psychometric theory, but more can be included. Thus, this paper presents a first attempt to formally integrate psychometric theory into questionnaire-based appraisals.","PeriodicalId":102108,"journal":{"name":"2010 Eighth ACIS International Conference on Software Engineering Research, Management and Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128911433","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards Automated Monitoring and Forecasting of Probabilistic Quality Properties in Open Source Software (OSS): A Striking Hybrid Approach","authors":"R. Parizi, A. Ghani","doi":"10.1109/SERA.2010.48","DOIUrl":"https://doi.org/10.1109/SERA.2010.48","url":null,"abstract":"In this paper, we propose a hybrid approach based on the aspect-orientation methodology and time series analysis for the runtime monitoring and quality forecasting of OSS. Specifically, the major objective of this work is to combine time series analysis with software quality assurance for OSS: statistical techniques for analyzing time series are used to facilitate the prediction and forecasting (the terms ‘prediction’ and ‘forecasting’ are used interchangeably in the literature) of probabilistic quality properties that are difficult or impossible to evaluate with current approaches such as testing. This also helps to increase the reliability and productivity of working OSS system components (towards trustworthy open source software development) that require strict runtime quality control. Furthermore, to reduce human effort and to cope with more sophisticated scenarios, this study also aims to automate the analysis and modeling process by providing an appropriate tool.","PeriodicalId":102108,"journal":{"name":"2010 Eighth ACIS International Conference on Software Engineering Research, Management and Applications","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116942692","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Cross Engine Database Joining","authors":"Wesley Leonard, Paul B. Albee","doi":"10.1109/SERA.2010.13","DOIUrl":"https://doi.org/10.1109/SERA.2010.13","url":null,"abstract":"A standards-based, open-source middleware system was designed and implemented to facilitate the analysis of large and disparate datasets. This system makes it possible to access several different types of database servers simultaneously, browse remote data, combine datasets, and join tables from remote databases independent of vendor. The system uses an algorithm known as Dynamic Merge Cache to handle data caching, query generation, transformations, and joining with minimal operational interference to source databases. The system is able to combine any subset of configured databases and convert the information into XML. The resulting XML is made available to analysis tools through a web service. After the system connects to a remote database, a metadata catalog is created from the source database. The user is able to configure which tables and fields to export from the remote dataset. The user is also able to filter, transform, and combine data. The system was tested with a large fish contaminant database and a second database populated with simulated scientific data.","PeriodicalId":102108,"journal":{"name":"2010 Eighth ACIS International Conference on Software Engineering Research, Management and Applications","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130512471","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"L-SYNC: Larger Degree Clustering Based Time-Synchronisation for Wireless Sensor Network","authors":"Masoume Jabbarifar, Alireza Shameli-Sendi, H. Pedram, M. Dehghan, M. Dagenais","doi":"10.1109/SERA.2010.30","DOIUrl":"https://doi.org/10.1109/SERA.2010.30","url":null,"abstract":"Many existing synchronization protocols for wireless sensor networks do not consider the effect of the routing algorithm on the synchronization precision of two remote nodes. Several protocols, such as SLTP, consider this issue for the local time estimation of a remote node, but cluster creation is based on node IDs. That technique increases cluster overlapping, which in turn affects the routing algorithm and requires more hops to move from one cluster to another remote cluster. In this article, we present L-SYNC, a method that creates large-degree clusters for wireless sensor network synchronization. Using large-degree clustering, L-SYNC can reduce the number of path hops. L-SYNC also uses linear regression to calculate the clock offset and skew in each cluster. It is therefore capable of computing skew and offset intervals between each node and its cluster head; in other words, it can estimate the local time of remote nodes in the future and in the past. To estimate the local time of remote nodes, the routing algorithm is used and a conversion is performed at each hop where the time reference changes. The fewer hops of L-SYNC can increase precision. Simulation results illustrate that this clustering formation can increase synchronization precision, although more overhead and a longer time period are needed for cluster formation.","PeriodicalId":102108,"journal":{"name":"2010 Eighth ACIS International Conference on Software Engineering Research, Management and Applications","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134397313","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}