Lahcène Brahimi, Yassine Ouhammou, Ladjel Bellatreche, Abdelkader Ouared
2016 IEEE Tenth International Conference on Research Challenges in Information Science (RCIS), June 2016
DOI: 10.1109/RCIS.2016.7549353
More transparency in testing results: Towards an open collective knowledge base
We are currently witnessing an explosion of advances in database technology covering all phases of database application design: non-functional requirements, conceptual modeling, logical modeling, deployment, physical design, and exploitation. Researchers and engineers cooperate to integrate these advances into database design. Their proposed solutions must be compared against similar studies whose results have been published either in scientific papers or on dedicated websites such as that of the TPC (Transaction Processing Performance Council). Recently, several researchers have highlighted the difficulty of reproducing the results of existing studies; the concern recalls the Volkswagen emissions-testing scandal. As a consequence, certain research communities now require that the environment and the results of testing activities be published, including simulator environments, so that evaluators can rerun the experiments on their own. In this paper, we first advocate transparency of testing in the database field. Second, we propose a repository dedicated to storing testing environments and results. The environment includes the data sets used, the deployment platform, non-functional requirements, the algorithms applied, hypotheses, etc.; the results are recorded together with their measurement units.
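The abstract describes the repository only at a high level: each entry bundles the testing environment (data sets, platform, non-functional requirements, algorithms, hypotheses) with results carrying explicit measurement units. A minimal sketch of what one such entry might look like is given below; all field and class names are illustrative assumptions, not the authors' actual schema.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical schema for one entry in a shared testing-results
# repository. Field names are illustrative, not taken from the paper.

@dataclass
class Measurement:
    name: str    # e.g. "query response time"
    value: float
    unit: str    # the measurement unit is stored alongside the value


@dataclass
class TestReport:
    datasets: List[str]             # data sets used in the experiment
    platform: str                   # deployment platform
    non_functional_reqs: List[str]  # e.g. response time, energy
    algorithms: List[str]           # algorithms under evaluation
    hypotheses: List[str]           # experimental assumptions
    results: List[Measurement] = field(default_factory=list)

    def add_result(self, name: str, value: float, unit: str) -> None:
        """Record one measured result with its unit."""
        self.results.append(Measurement(name, value, unit))


# Example: recording one benchmark run with explicit units
report = TestReport(
    datasets=["TPC-H 10 GB"],
    platform="PostgreSQL 9.5 / Linux x86-64",
    non_functional_reqs=["query response time"],
    algorithms=["index selection heuristic"],
    hypotheses=["cold cache"],
)
report.add_result("Q1 response time", 4.2, "s")
```

Keeping the unit as an explicit field, rather than implied by convention, is one way to make results comparable across entries contributed by different evaluators.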