{"title":"信息检索中的再现性专题导论","authors":"N. Ferro, N. Fuhr, A. Rauber","doi":"10.1145/3268408","DOIUrl":null,"url":null,"abstract":"Information Retrieval (IR) is a discipline that has been strongly rooted in experimentation since its inception. Experimental evaluation has always been a strong driver for IR research and innovation, and these activities have been shaped by large-scale evaluation campaigns such as Text REtrieval Conference (TREC) in the US, Conference and Labs of the Evaluation Forum (CLEF) in Europe, NII Testbeds and Community for Information access Research (NTCIR) in Japan and Asia, and Forum for Information Retrieval Evaluation (FIRE) in India. IR systems are becoming increasingly complex. They need to cross language and media barriers; they span from unstructured, via semi-structured, to highly structured data; and they are faced with diverse, complex, and frequently underspecified (ambiguously specified) information needs, search tasks, and societal challenges. As a consequence, evaluation and experimentation, which has remained a fundamental element, has in turn become increasingly sophisticated and challenging.","PeriodicalId":15582,"journal":{"name":"Journal of Data and Information Quality (JDIQ)","volume":"72 1","pages":"1 - 4"},"PeriodicalIF":0.0000,"publicationDate":"2018-10-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Introduction to the Special Issue on Reproducibility in Information Retrieval\",\"authors\":\"N. Ferro, N. Fuhr, A. Rauber\",\"doi\":\"10.1145/3268408\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Information Retrieval (IR) is a discipline that has been strongly rooted in experimentation since its inception. Experimental evaluation has always been a strong driver for IR research and innovation, and these activities have been shaped by large-scale evaluation campaigns such as Text REtrieval Conference (TREC) in the US, Conference and Labs of the Evaluation Forum (CLEF) in Europe, NII Testbeds and Community for Information access Research (NTCIR) in Japan and Asia, and Forum for Information Retrieval Evaluation (FIRE) in India. IR systems are becoming increasingly complex. They need to cross language and media barriers; they span from unstructured, via semi-structured, to highly structured data; and they are faced with diverse, complex, and frequently underspecified (ambiguously specified) information needs, search tasks, and societal challenges. 
As a consequence, evaluation and experimentation, which has remained a fundamental element, has in turn become increasingly sophisticated and challenging.\",\"PeriodicalId\":15582,\"journal\":{\"name\":\"Journal of Data and Information Quality (JDIQ)\",\"volume\":\"72 1\",\"pages\":\"1 - 4\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-10-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Data and Information Quality (JDIQ)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3268408\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Data and Information Quality (JDIQ)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3268408","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Introduction to the Special Issue on Reproducibility in Information Retrieval
Information Retrieval (IR) is a discipline that has been deeply rooted in experimentation since its inception. Experimental evaluation has always been a strong driver of IR research and innovation, and these activities have been shaped by large-scale evaluation campaigns such as the Text REtrieval Conference (TREC) in the US, the Conference and Labs of the Evaluation Forum (CLEF) in Europe, the NII Testbeds and Community for Information access Research (NTCIR) in Japan and Asia, and the Forum for Information Retrieval Evaluation (FIRE) in India. IR systems are becoming increasingly complex: they need to cross language and media barriers; they span unstructured, semi-structured, and highly structured data; and they face diverse, complex, and frequently underspecified or ambiguously specified information needs, search tasks, and societal challenges. As a consequence, evaluation and experimentation, which have remained a fundamental element of IR research, have in turn become increasingly sophisticated and challenging.