K. Gavrilov
Sociology: methodology, methods, mathematical modeling (Sociology: 4M)
DOI: 10.19181/4m.2021.53.5 · Published: 2022-12-18
Toloka platform as a source of online survey participants: an experience of assessing data quality
The article presents the experience of using the Yandex Toloka crowdsourcing platform to recruit respondents for an online survey. Drawing on methodological publications about a similar foreign platform, Amazon Mechanical Turk, we put forward hypotheses about the quality of data obtained via Toloka compared with results collected using other types of convenience samples: online panels and respondents recruited through social networks. Additionally, based on the Toloka data alone, we assessed an indicator of respondent attentiveness. The main conclusion is that Toloka makes it possible to recruit respondents quickly and at low cost, and the results are comparable in quality to those obtained by the other methods. In particular, Toloka respondents almost always complete the survey and fill out questionnaires faster than other types of respondents. They are less prone than online panel participants to "straightlining" (giving the same answer to every item of a tabular question). They answer the open-ended question as often as social media respondents do (though less often than online panel participants), although their responses are shorter. Only 36% of respondents passed the attention-check question; attentive participants took longer to complete the questionnaire and were less likely to straightline. Increasing the reward did not raise the proportion of attentive respondents, but it slowed questionnaire completion, increased the number of answers to the open-ended question, and reduced the proportion of straightliners.
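The article does not publish its analysis code, but two of the quality indicators it relies on, straightlining in a tabular (grid) question and the attention-check pass rate, are straightforward to compute. The sketch below is a hypothetical illustration; the field names (`check`, `grid`) and the pass criterion are assumptions, not the authors' actual instrument.

```python
def is_straightliner(grid_answers):
    """True if every item of a grid (tabular) question received the same answer."""
    return len(set(grid_answers)) == 1

def quality_summary(respondents, attention_key, correct_answer, grid_key):
    """Share of respondents who passed the attention check and share of straightliners."""
    n = len(respondents)
    attentive = sum(1 for r in respondents if r[attention_key] == correct_answer)
    straight = sum(1 for r in respondents if is_straightliner(r[grid_key]))
    return {
        "attentive_share": attentive / n,
        "straightliner_share": straight / n,
    }

# Toy records with made-up field names, for illustration only.
sample = [
    {"check": "B", "grid": [3, 3, 3, 3]},  # failed the check; straightliner
    {"check": "A", "grid": [1, 4, 2, 5]},  # passed the check; varied answers
    {"check": "A", "grid": [2, 2, 2, 2]},  # passed the check; straightliner
]
print(quality_summary(sample, "check", "A", "grid"))
# → {'attentive_share': 0.6666666666666666, 'straightliner_share': 0.6666666666666666}
```

In a real analysis such flags would then be cross-tabulated with completion time and reward level, as the article does when comparing attentive and inattentive respondents.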