{"title":"EProf: An Accurate Energy Consumption Estimation Tool","authors":"A. D'Amato, L. B. Ruiz, A. F. Silva, J. C. D. Costa","doi":"10.1109/SCCC.2011.28","DOIUrl":"https://doi.org/10.1109/SCCC.2011.28","url":null,"abstract":"Energy efficiency is a primary concern in Wireless Sensor Networks because, in general, wireless sensor nodes run on non-rechargeable batteries. Therefore, estimating the expected node lifetime is indispensable, and energy is an important resource that the application developer should optimize. For the developer to recognize that a program may violate this power restriction, it is important to estimate the energy that the application will consume. In this paper we present EProf, a cycle-accurate instruction-level simulator developed to estimate the energy consumption of applications running on the UNB-RISC16 processor. EProf is able to measure detailed time-critical phenomena and can be used to shed new light on design issues for large-scale sensor network applications.","PeriodicalId":173639,"journal":{"name":"2011 30th International Conference of the Chilean Computer Science Society","volume":"86 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123182544","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Comparative Analysis of the Agile and Traditional Software Development Processes Productivity","authors":"W. Carvalho, P. F. Rosa, M. S. Soares, Marco Antonio Teixeira da Cunha, Luiz Carlos Buiatte","doi":"10.1109/SCCC.2011.11","DOIUrl":"https://doi.org/10.1109/SCCC.2011.11","url":null,"abstract":"Software development processes are essential for an organization to achieve the required levels of productivity and quality. The productivity analysis of agile and traditional development processes is an open and little-explored research area that has attracted interest from both industry and academia seeking to exploit the strengths of the two approaches. This research investigates options for integrating agile and traditional practices by defining a hybrid process that takes advantage of both. An empirical study evaluating the productivity impact of the proposed hybrid process was carried out in a Brazilian medium-sized company rated at CMMI v1.1 Maturity Level 2. Five groups of similar projects were compared with respect to productivity; some were developed using the new hybrid process and others using the older RUP-based process. Quantitative results show that four of the five project groups achieved a significant productivity increase with Scrum-RUP projects. The study shows that it is possible to integrate agile practices into the software development process without losing the rigor needed in the desired subprocesses and still obtain real productivity gains.","PeriodicalId":173639,"journal":{"name":"2011 30th International Conference of the Chilean Computer Science Society","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124917493","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Argopecten Purpuratus Codification Based on Determination of Weight by Conversion and Adjustment Factors","authors":"Alexander Rodríguez, Alicia Diaz-Zea, R. Flores, Medardo Delgado Paredes, D. Barrios-Aranibar, Raquel Patiño","doi":"10.1109/SCCC.2011.20","DOIUrl":"https://doi.org/10.1109/SCCC.2011.20","url":null,"abstract":"The codification of Argopecten Purpuratus is a process in which the stem and coral are classified by weight into different codes. This process is done manually and is therefore subject to the subjectivity and fatigue of the people involved in the work. Computer vision is an alternative for automating this process. The present work proposes a method to classify Argopecten Purpuratus based on the determination of weight by conversion and adjustment factors. These factors use the area of the whole scallop and of the coral to make the estimation. Experimental results show that the computer vision system achieved an overall accuracy of 98%.","PeriodicalId":173639,"journal":{"name":"2011 30th International Conference of the Chilean Computer Science Society","volume":"522 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123413394","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Defining and Validating Virtual Worlds Usability Heuristics","authors":"R. Muñoz-Soto, T. Barcelos, Virgínia Chalegre","doi":"10.1109/SCCC.2011.23","DOIUrl":"https://doi.org/10.1109/SCCC.2011.23","url":null,"abstract":"Virtual Worlds (VWs) are computer-simulated environments, in two or three dimensions, where hundreds of people can interact with each other in real time. Current heuristic evaluation can potentially identify issues related to the usability of VWs, but it may miss domain-specific problems. For this reason, the development and use of appropriate heuristics is highly significant. This paper describes the development and validation process of a novel set of usability heuristics for VW applications and an associated checklist. An experimental evaluation showed that the new heuristic set can potentially identify usability problems that are more closely related to the VW context and that would not be found using the traditional Nielsen's heuristics.","PeriodicalId":173639,"journal":{"name":"2011 30th International Conference of the Chilean Computer Science Society","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127646844","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Process-Service Interactions Using a SOA-BPM-Based Methodology","authors":"P. Bazán, G. Perez, R. Giandini, Javier Diaz","doi":"10.1109/SCCC.2011.14","DOIUrl":"https://doi.org/10.1109/SCCC.2011.14","url":null,"abstract":"Reducing the gap between the definition, modeling and management of business processes and the realization of the main process functions through software services requires a methodological approach. Addressing this gap, this paper focuses on the interactions between the activities of a business process and the functionality provided by software services. It presents a meta-model to formally specify such interactions. The proposed meta-model is instantiated through a case study. Lessons learned through the development of an example were used to define a graphic editor prototype as an automated tool for managing interactions between business processes and software services. The proposed approach enables the alternative application of top-down and bottom-up analysis techniques for modeling processes and services, and assists developers with an integrated tool for modeling business processes and software services in a seamlessly combined way.","PeriodicalId":173639,"journal":{"name":"2011 30th International Conference of the Chilean Computer Science Society","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115369108","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Sim-LIT: A Simulation Framework for Image Quality Assessment in Wireless Visual Sensor Networks under Packet Loss Conditions","authors":"Eric Orellana-Romero, Javier SanMartin-Hernandez, C. Duran-Faundez, V. Lecuire, Cristhian Aguilera","doi":"10.1109/SCCC.2011.27","DOIUrl":"https://doi.org/10.1109/SCCC.2011.27","url":null,"abstract":"In this paper we present Sim-LIT, a framework for simulating the effects of packet loss on the quality of coded or non-coded still images transported over wireless sensor networks. The tool focuses on image quality assessment and can be used to evaluate error resilience during image communication. This first version provides the evaluation of block interleaving methods. Sim-LIT is highly configurable, offering several options and additional tools. It can be used to rapidly evaluate interleaving algorithms or other techniques, or to perform extensive tests over various image files and loss patterns. Through different simulations we demonstrate the potential of Sim-LIT as a tool for supporting research in the image processing and wireless sensor network domains.","PeriodicalId":173639,"journal":{"name":"2011 30th International Conference of the Chilean Computer Science Society","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116023652","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"CQA-WF: Consistent Query Answers to Conjunctive Queries Using the Well-Founded Semantics","authors":"Julia Belmar, J. Cuevas, Mónica Caniupán Marileo","doi":"10.1109/SCCC.2011.9","DOIUrl":"https://doi.org/10.1109/SCCC.2011.9","url":null,"abstract":"A database instance can become inconsistent with respect to its integrity constraints (ICs), for instance, after update operations. When this happens, it is possible to compute repairs of the database. A minimal repair is a new database instance that satisfies the ICs, is obtained by applying update operations over the original instance, and differs minimally from it. We can evaluate queries over the repairs; an answer to a conjunctive query is consistent if it is an answer in every repair. The repairs of database instances can be specified by repair programs, and consistent answers can be computed by evaluating query programs together with the repair programs under the stable model semantics. The use of logic programs does not exceed the intrinsic complexity of consistent query answering. Nevertheless, for a certain class of conjunctive queries and ICs it is possible to use the alternative well-founded semantics (WFS), which has lower data complexity than the stable model semantics, to evaluate queries. We present CQA-WF, a system that computes consistent answers to conjunctive queries over databases that are inconsistent with respect to functional dependencies (FDs). CQA-WF evaluates logic programs under the WFS.","PeriodicalId":173639,"journal":{"name":"2011 30th International Conference of the Chilean Computer Science Society","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129383669","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Enabling Situation-Aware Behavior in Web-Based Learning Systems","authors":"A. Pernas, J. Oliveira","doi":"10.1109/SCCC.2011.21","DOIUrl":"https://doi.org/10.1109/SCCC.2011.21","url":null,"abstract":"Adaptive systems need to be aware of the user context and the occurrence of specific events in order to react properly. In particular, web-based educational systems must be improved to consider the learner in his or her current situation. Thus, the main objective of this paper is to present a system able to recognize the situations configured in an e-learning environment, aiming to suggest better actions for students whenever necessary, for example, when a learner has difficulty solving an exercise or spends too much time finishing a course. In this work, our solution is developed as an extension of an already functional learning environment, called AdaptWeb®, and enables situation-aware behavior customized to its audience. The main challenge is monitoring events and detecting situations.","PeriodicalId":173639,"journal":{"name":"2011 30th International Conference of the Chilean Computer Science Society","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121534583","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Data Warehouse Fixer: Fixing Inconsistencies in Data Warehouses","authors":"Mónica Caniupán Marileo, Alvaro Placencia","doi":"10.1109/SCCC.2011.5","DOIUrl":"https://doi.org/10.1109/SCCC.2011.5","url":null,"abstract":"Dimensions in Data Warehouses (DWs) are sets of elements connected by a hierarchical relationship. Usually, dimensions are required to be strict and covering to support summarizations at different levels of granularity. A dimension is strict if all its rollup relations are functions, and covering if every element in a category is connected to an element in each of its ancestor categories. We present the Data Warehouse Fixer (DWF), a system that restores the consistency of dimensions that fail to satisfy their strictness constraints. The system checks consistency, computes minimal repairs for inconsistent dimensions by implementing Datalog programs with negation and weak constraints, and fixes the inconsistent dimensions.","PeriodicalId":173639,"journal":{"name":"2011 30th International Conference of the Chilean Computer Science Society","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127471691","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Effect of Combining Compiler Optimizations on Code Size","authors":"J. Foleiss, A. F. Silva, L. B. Ruiz","doi":"10.1109/SCCC.2011.25","DOIUrl":"https://doi.org/10.1109/SCCC.2011.25","url":null,"abstract":"Lately, our embedded computing research group has been working on code generation targeting small footprint and low energy consumption. Specifically, we have been researching the effect of optimizations on code size. Previously, we concluded that evaluating single optimizations does not lead to interesting results, because the efficiency of an optimization in reducing code size is greatly affected by the combination of optimization techniques being used. It is therefore necessary to identify which optimization techniques are strongly related with respect to code size reduction. In this paper, we present experimental evaluation results that reveal relationships among optimization techniques that yield small binaries.","PeriodicalId":173639,"journal":{"name":"2011 30th International Conference of the Chilean Computer Science Society","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134570461","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}