Implicit gaze based annotations to support second language learning
Ayano Okoso, K. Kunze, K. Kise
Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication, 2014-09-13
DOI: 10.1145/2638728.2638783
This paper explores whether implicit gaze-based annotations can support the reading comprehension tasks of second-language learners. We show how to use eye tracking to add implicit annotations to the text the user reads, starting with physical features (reading speed, re-reading, number of fixation areas) attached to documents via eye tracking. We present initial results of an ongoing experiment: so far, we have recorded the eye gaze of two students for two documents. We gathered initial feedback by presenting the annotated documents to two English teachers. Overall, we believe implicit annotations can be a useful feedback mechanism for second-language learners.
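The three physical features named in the abstract could, under simple assumptions, be derived from a sequence of fixations. The sketch below is purely illustrative and not the paper's implementation: it assumes fixations have already been mapped to text lines, treats a regression to an earlier line as re-reading, and counts distinct lines as fixation areas. All names (`Fixation`, `annotate`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    line: int        # text line the fixation was mapped to (assumed given)
    start_ms: int    # fixation onset time
    duration_ms: int # fixation duration

def annotate(fixations, word_count):
    """Illustrative derivation of the abstract's three features:
    reading speed (words per minute), re-reading events, and
    number of fixation areas. Not the authors' actual method."""
    if not fixations:
        return {"wpm": 0.0, "rereads": 0, "fixation_areas": 0}
    # Reading speed: words covered over the total gaze span.
    total_ms = (fixations[-1].start_ms + fixations[-1].duration_ms
                - fixations[0].start_ms)
    wpm = word_count / (total_ms / 60000.0) if total_ms > 0 else 0.0
    # Re-reading: the gaze jumps back above the furthest line seen so far.
    rereads, furthest = 0, fixations[0].line
    for f in fixations[1:]:
        if f.line < furthest:
            rereads += 1
        furthest = max(furthest, f.line)
    # Fixation areas: distinct lines that attracted at least one fixation.
    areas = len({f.line for f in fixations})
    return {"wpm": round(wpm, 1), "rereads": rereads, "fixation_areas": areas}
```

Per-document values like these could then be overlaid on the text as the implicit annotations shown to teachers.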