Using Existing School Messaging Platforms to Inform Parents about Their Child’s Attendance

Tareena Musaddiq, Alexa Prettyman, Jonathan Smith

Journal of Research on Educational Effectiveness, published November 2, 2023. DOI: 10.1080/19345747.2023.2264841

Abstract

School attendance is strongly associated with academic success and high school completion, but approximately one in seven students misses nearly one month of school each year. To address absenteeism, we partnered with four public school districts in the metro-Atlanta area and experimentally deployed email and text messages to inform parents about their child’s attendance. Parents received personalized monthly messages through the school districts’ existing messaging platforms at zero marginal cost per message. The messages informed parents of their child’s number of absences and how that number compared to the absences of their peers. For most parents, this information was delivered through email rather than text, and parents of the students most in need of improved attendance were the hardest to reach. Intent-to-treat estimates show the intervention reduced end-of-year absences by four-tenths to two-thirds of a day (2 to 3%) and reduced the probability of chronic absenteeism by 2 to 6%, while actually receiving the messages reduced end-of-year absences by two-thirds to almost one day (3 to 4%) and reduced the probability of chronic absenteeism by 4 to 7%.

Keywords: chronic absenteeism; attendance nudges; district communication

Disclosure Statement

No potential conflict of interest was reported by the author(s).

Notes

1. We use “parent” as a general term for a student’s caregiver, whether a parent or legal guardian.
2. Our study only uses the district-wide messaging platform to contact parents. Parents may be contacted by school administration and teachers through various other channels (e.g., classroom apps) independent of this district-wide system. Our study does not include those means of communication and cannot speak to their effectiveness.
3. Subsample analyses in the two districts that implemented the experiment with fidelity reveal similar improvements in attendance for students in grades K-8 (but not high school), and across gender, race, and most other demographic characteristics. However, we do not find a statistically significant improvement for Hispanic students or ELL students, even though the messaging platform translated the message into the parents’ preferred language.
4. See Table A1 for a full list of recent studies and their details.
5. See, for example, Frey and Meier (2004), Shang and Croson (2009), Ferraro et al. (2011), and Coffman et al. (2017).
6. See Nguyen (2008), Jensen (2010), Oreopoulos and Dunn (2013), and Dinkelman and Martínez (2014) for evidence on how education outcomes improve after parents or students are informed about the returns to, or costs of, educational investments. Bettinger et al. (2012) is an example in which information alone was insufficient for improving educational attainment.
7. For a more detailed review of nudges in education, see Damgaard and Nielsen (2018).
8. See Table A1 for the location, timing, target population, intervention, and results of recent studies that have used nudge theory to influence attendance behavior.
9. Authors’ calculation based on the interactive data visualization tool for chronic absences across the United States, available at http://www.hamiltonproject.org/charts/chronic_absence_across_the_united_states.
10. U.S. Census. https://patch.com/georgia/atlanta/how-georgia-education-spending-ranks-nationwide-census-bureau; https://www.census.gov/library/visualizations/2017/comm/cb17-97-public-education-finance.html
11. SchoolMessenger offers a variety of services to the school districts, including mobile apps, student emails, and school websites; for more information, see www.schoolmessenger.com. Blackboard is a learning management system that allows students and teachers to access learning resources online, view course contents and grades, and participate in online discussion forums; for more information, see www.blackboard.com/k12/index.html.
12. To define our target sample of students, we followed the U.S. Department of Education in defining chronic absenteeism as missing 15 days of school, whether excused or unexcused (see https://www2.ed.gov/datastory/chronicabsenteeism.html#intro). The state of Georgia, however, defines chronic absenteeism as missing 10% or more of school days in a year, which is roughly 18 or more days.
13. At the end of the school year, 73% of the control group had 15 or more absences, indicating that the linear model was fairly accurate in predicting chronic absenteeism.
14. In one district, where the process varied slightly, only students with valid parental contact details were selected into the experiment and assigned to control and treatment groups. Additionally, students flagged as medically fragile were removed from the experimental sample.
15. Moreover, at the request of the districts, we assigned to treatment the minimum number of students needed to detect modest effect sizes, as opposed to splitting the experimental group evenly, which would have maximized statistical power. Based on our reading of the literature and our discussions with the districts, we used a one-day decrease in absences as the target.
16. We conduct sensitivity analyses for the attrition rates, baseline balance, and main results excluding this district. Using Figure A1, the overall attrition rate for Districts A, B, and C is 0.213. The attrition rates in the control and treatment groups are 0.233 and 0.160, respectively. The baseline imbalance on initial absences is in the opposite direction of the imbalance in District D (see Table 2). The main results are smaller in magnitude and imprecisely estimated (see Table A5).
17. One district’s opt-out message read: “Thank you for agreeing to participate in our study to improve school attendance at [Your District]. If you’d like to stop receiving messages about your child’s absences or you are receiving this in error, please fill out the information below. You will not receive any further communications about this study, but you will still receive other district related communications (weather closings, school/district announcements, meal balances, etc.). Please be sure to include the email and/or phone number to which you received this initial message.”
18. The messages varied slightly across districts due to district-specific needs and preferences. The timing of the first message also varied across districts. Table A2 details the differences. Of particular note, District D, which sees the biggest effect, sent a text message reminding parents to check their email, which contained the above details.
19. After the first month of messages, we did not send messages in subsequent months if a student’s year-to-date absences decreased relative to the previous month or if their percentile rank fell below 50%. These cases, 1.5% of treatment students, were likely a result of updated administrative records, and we wanted to avoid sending inconsistent or inaccurate messages.
20. Race/ethnicity categories are not mutually exclusive.
21. Using the state definition of chronic absenteeism (i.e., missing at least 10% of days enrolled), 61% of the students in the experimental group across all four districts were chronically absent.
22. See Table A4 for more summary statistics, by district, of the experimental group relative to the non-experimental group.
23. See Table A4 for the demographic characteristics of the students in the non-experimental, control, and treatment groups, broken down by district.
24. Weights are calculated separately for each district and sum to the size of the control group within the district. These weights reduce the imbalance on initial absences (see the weighted balance check in Table A6), and the weighted results in Table A7 are statistically indistinguishable from the main results.
25. One district did not send texts and instead made robocalls. See Table A3 for more details about implementation. We interpret the results from a robocall similarly to those from a text, since the communication is via phone. The delivery status for a robocall is either answered, answering machine, or invalid phone number; a robocall was considered received if the status was answered or answering machine.
26. Per the SchoolMessenger Communicate user guide, “sent” indicates that the message was sent but SchoolMessenger has not received verification from the recipient’s email server; “delivered” indicates that the message was sent and SchoolMessenger has received confirmation that the recipient’s email server successfully queued the message for delivery; and “opened” indicates the message was opened by the recipient. We do not differentiate between “sent”, “delivered”, and “opened” because some of this classification depends on when the delivery report was pulled. In districts that pulled the delivery report immediately after sending the message, there are fewer “opened” messages than in districts that pulled the report later.
27. Enrollment and absence data for 97% of the students come directly from the school districts. The attendance data for the remaining 3% of students were missing, so we use data about and from the districts that is collected and cleaned by the state. The two measures align almost perfectly in overlapping observations, but we still control for an indicator for the source of the data.
28. Our randomization was at the individual level; therefore, we use robust standard errors and do not cluster at a higher level of aggregation. We do, however, check the statistical significance of our estimates when clustering standard errors by (i) school, (ii) grade by district, (iii) grade, and (iv) district. Our estimates remain robust in all cases.
29. These effect sizes are statistically significant at the 99% confidence level for the districts that implemented with fidelity and at the 95% confidence level for all four districts. Among Districts B, C, and D, the intent-to-treat effect is a 0.552-day reduction and the treatment-on-treated effect is a 0.791-day reduction; both estimates are statistically significant at the 99% confidence level.
30. Not only did District D implement with fidelity, but it also provides the most statistical power because all students in the experiment had valid contact information for their parents.
31. This can be seen in the proportion of students’ parents who received the first message, first text, last message, and last text in Table A4. In addition, Figures A2 and A3 graph the probability of receiving the first and last message via any mode of contact or via text, respectively, by the number of initial days absent.
32. We measure chronic absenteeism using both the national and state definitions. Under the national definition, students who were absent 15 or more days from school in a year are indicated as chronically absent; under the state definition, students who missed 10% or more of the days they were enrolled are indicated as chronically absent. The results do not differ across definitions.
33. Due to data limitations, this analysis is only available for the two districts that implemented the experiment with fidelity.

Funding

The authors would like to thank Arnold Ventures for generously supporting this study through a Policy Labs grant. Tareena Musaddiq’s postdoctoral fellowship was supported by training grant R305B170015 from the U.S. Department of Education’s Institute of Education Sciences. The opinions expressed in this research are those of the researchers and are not attributable to the school districts. The intervention was pre-registered on the Open Science Framework and can be accessed at https://osf.io/n6jwy.
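The two chronic-absenteeism definitions used in the notes (national: 15 or more absences in a year; Georgia: 10% or more of enrolled days) can be sketched as a small helper. This is an illustration only; the 180-day school year used below is an assumption, since the excerpt does not state the districts' calendar length.

```python
def chronically_absent(days_absent, days_enrolled, definition="national"):
    """Flag chronic absenteeism under the two definitions in the study.

    national: 15 or more absences in the year (U.S. Dept. of Education).
    state:    absent 10% or more of enrolled days (State of Georgia).
    """
    if definition == "national":
        return days_absent >= 15
    if definition == "state":
        return days_absent >= 0.10 * days_enrolled
    raise ValueError(f"unknown definition: {definition}")

# With an assumed 180-day year, the state threshold is 18 days, matching
# the "roughly 18 or more days" figure in the notes.
print(chronically_absent(15, 180, "national"))  # True
print(chronically_absent(15, 180, "state"))     # False (threshold is 18)
```

A student with 15-17 absences in a 180-day year is chronically absent under the national definition but not the state one, which is why the study reports results under both.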
Using Existing School Messaging Platforms to Inform Parents about Their Child’s Attendance
AbstractSchool attendance is strongly associated with academic success and high school completion, but approximately one-in-seven students miss nearly one month of school each year. To address absenteeism, we partnered with four public school districts in the metro-Atlanta area and experimentally deployed email and text messages to inform parents about their child’s attendance. Parents received personalized monthly messages through the school districts’ existing messaging platforms that had zero marginal cost per message. The messages informed parents about their child’s number of absences and how that number compared to absences of their peers. For most parents, this information was delivered through email as opposed to text, and parents of students most in need of improved attendance were the hardest to reach. Intent-to-treat estimates show the intervention reduced end-of-year absences by four-tenths to two-thirds of a day (2 to 3%) and reduced the probability of chronic absenteeism by 2 to 6%, while actually receiving the messages reduced end-of-year absences by two-thirds to almost one day (3 to 4%) and reduced the probability of chronic absenteeism by 4 to 7%.Keywords: Chronic absenteeismattendance nudgesdistrict communication Disclosure StatementNo potential conflict of interest was reported by the author(s).Notes1 We use “parent” as a general term to represent a student’s caregiver, whether a parent or legal guardian.2 Our study only uses the district-wide messaging platform to contact parents. Parents may be contacted by school administration and teachers through various other ways (e.g., classroom apps) independent of this district-wide system. 
Our study does not include those means of communication and cannot speak to their effectiveness.3 Subsample analyses in the two districts that implemented the experiment with fidelity reveal similar improvements in attendance for students in grades K-8 (but not high school), and across gender, race, and most other demographic characteristics. However, we do not find a statistically significant improvement for Hispanic students or ELL students, despite the fact that the message was translated into the parents’ preferred language in the messaging platform.4 See Table A1 for a full list of studies and details in recent years.5 See, for example, Frey and Meier (Citation2004), Shang and Croson (Citation2009), Ferraro et al. (Citation2011), and Coffman et al. (Citation2017).6 See Nguyen (Citation2008), Jensen (Citation2010), Oreopoulos and Dunn (Citation2013), Dinkelman and Martínez (2014) for evidence on how education outcomes improve after parents or students are informed about the returns to, or costs of, educational investments. Bettinger et al. (Citation2012) is an example in which information alone was insufficient for improving educational attainment.7 For a more detailed review on nudges in education, please see Damgaard and Nielsen (Citation2018).8 See Table A1 for details about the location, timing, target population, intervention, and results for recent studies that have used nudge theory to influence attendance behavior.9 Authors’ calculation based on the interactive data visualization tool for chronic absences across the United States, which is available at: http://www.hamiltonproject.org/charts/chronic_absence_across_the_united_states.10 U.S. Census. https://patch.com/georgia/atlanta/how-georgia-education-spending-ranks-nationwide-census-bureau. 
https://www.census.gov/library/visualizations/2017/comm/cb17-97-public-education-finance.html11 School Messenger offers a variety of services to the school districts including mobile apps, student emails, and school websites. For more information see www.schoolmessenger.com. Blackboard is a Learning Management System that allows students and teachers to access learning resources online, view course contents and grades, and participate in online discussion forums. For more information, see www.blackboard.com/k12/index.html.12 To define our target sample of students, we followed the U.S. Department of Education in defining chronic absenteeism as missing 15 days of school regardless of being excused or unexcused (see: https://www2.ed.gov/datastory/chronicabsenteeism.html#intro). The state of Georgia, however, defines chronic absenteeism as missing 10% or more of school days in a year, which is roughly 18 or more days in a year.13 At the end of the school year, 73% of the control group had 15 or more absences, indicating that the linear model was fairly accurate in predicting chronic absenteeism.14 In one district, where the process varied slightly, only students with valid parental contact details were selected to be part of the experiment and assigned to control and treatment groups. Additionally, students indicated as medically fragile were removed from the experimental sample.15 Moreover, at the request of the districts, we assigned the minimum number of students needed to detect modest effect sizes to the treatment, as opposed to splitting the experimental group evenly, which would have maximized statistical power. Based on our reading of the literature and our discussions with the district, we used a one-day decrease in absences as the target.16 We conduct sensitivity analyses for the attrition rates, baseline balance, and main results excluding this district. Using Figure A1, the overall attrition rate for Districts A, B, and C is 0.213. 
The attrition rates in the control and treatment groups are 0.233 and 0.160, respectively. The baseline imbalance on initial absences is in the opposite direction of the imbalance in District D (see Table 2). The main results are smaller in magnitude and imprecisely estimated (see Table A5).
17. One district's opt-out message read: "Thank you for agreeing to participate in our study to improve school attendance at [Your District]. If you'd like to stop receiving messages about your child's absences or you are receiving this in error, please fill out the information below. You will not receive any further communications about this study, but you will still receive other district related communications (weather closings, school/district announcements, meal balances, etc.). Please be sure to include the email and/or phone number to which you received this initial message."
18. The messages varied slightly across districts due to district-specific needs and preferences. The timing of the first message also varied across districts. Table A2 details the differences across districts. Of particular note, District D, which sees the biggest effect, sent a text message reminding parents to check their email, which contained the above details.
19. After the first month of messages, we did not send messages in subsequent months if a student's year-to-date absences decreased relative to the previous month or if their percentile rank fell below 50%. These cases (1.5% of treatment students) likely resulted from updated administrative records, and we wanted to avoid sending inconsistent or inaccurate messages.
20. Race/ethnicity categories are not mutually exclusive.
21. Using the state definition of chronic absenteeism (i.e.,
missing at least 10% of days enrolled), 61% of the students in the experimental group across all four districts were chronically absent.
22. See Table A4 for additional summary statistics by district for the experimental group relative to the non-experimental group.
23. See Table A4 for the demographic characteristics of students in the non-experimental, control, and treatment groups, broken down by district.
24. Weights are calculated separately for each district and sum to the size of the control group within the district. These weights reduce the imbalance on initial absences (see the weighted balance check in Table A6), and the weighted results in Table A7 are statistically indistinguishable from the main results.
25. One district did not send texts and instead made robocalls. See Table A3 for more details about implementation. We interpret the results from a robocall similarly to a text, since both communications are via phone. The delivery status for a robocall is either answered, answering machine, or invalid phone number. A robocall was considered received if the delivery status was answered or answering machine.
26. Per the SchoolMessenger Communicate user guide, "sent" indicates that the message was sent but SchoolMessenger has not received verification from the recipient's email server; "delivered" indicates that the message was sent and SchoolMessenger has received confirmation that the recipient's email server successfully queued the message for delivery; and "opened" indicates that the message was opened by the recipient. We do not differentiate between "sent", "delivered", and "opened" because this classification is partly a function of when the delivery report was pulled. In districts that pulled the delivery report immediately after sending the message, there are fewer "opened" messages than in districts that pulled the report later.
27. Enrollment and absence data for 97% of the students come directly from the school districts.
Attendance data for the remaining 3% of students were missing, so we use district data collected and cleaned by the state. The two measures align almost perfectly in overlapping observations, but we still control for an indicator of the data source.
28. Our randomization was at the individual level; therefore, we use robust standard errors and do not cluster at a higher level of aggregation. We do, however, check the robustness of our estimates to clustering standard errors by (i) school, (ii) grade by district, (iii) grade, and (iv) district. Our estimates remain robust in all cases.
29. These effect sizes are statistically significant at the 99% confidence level for the districts that implemented with fidelity and at the 95% confidence level for all four districts. Among Districts B, C, and D, the intent-to-treat effect size is a 0.552-day reduction and the treatment-on-treated effect size is a 0.791-day reduction. Both estimates are statistically significant at the 99% confidence level.
30. Not only did District D implement with fidelity, but it also provides the most statistical power because all students in the experiment had valid parental contact information.
31. This can be seen in the proportion of students' parents who received the first message, first text, last message, and last text in Table A4. In addition, Figures A2 and A3 graph the probability of receiving the first and last message via any mode of contact or via text, respectively, by the number of initial days absent.
32. We measure chronic absenteeism using both the national and state definitions. Under the national definition, students who were absent 15 or more days in a year are indicated as chronically absent; under the state definition, students who missed 10% or more of the days they were enrolled are indicated as chronically absent.
The results do not differ across definitions.
33. Due to data limitations, this analysis is only available for the two districts that implemented the experiment with fidelity.
Additional information
Funding
The authors would like to thank Arnold Ventures for generously supporting this study through a Policy Labs grant. Tareena Musaddiq's postdoctoral fellowship was supported by training grant R305B170015 from the U.S. Department of Education's Institute of Education Sciences. The opinions expressed in this research are those of the researchers and are not attributable to the school districts. The intervention was pre-registered on the Open Science Framework and can be accessed at: https://osf.io/n6jwy.
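The two chronic-absenteeism definitions used in the notes (national: 15 or more absences; Georgia: 10% or more of enrolled days) amount to a simple classification rule. A minimal sketch of that rule follows; the function name and the 180-day school year are illustrative assumptions, not details from the study:

```python
def chronically_absent(days_absent: int, days_enrolled: int) -> tuple[bool, bool]:
    """Classify a student under the two definitions discussed in the notes.

    National (U.S. Dept. of Education): 15 or more absences in the year.
    State (Georgia): absences totaling 10% or more of days enrolled.
    Returns (national_flag, state_flag).
    """
    national = days_absent >= 15
    state = days_absent >= 0.10 * days_enrolled
    return national, state

# For a 180-day school year the state threshold is 18 days, so a student
# with 16 absences is chronically absent under the national definition
# but not under the state definition.
```

This makes concrete why the national definition flags slightly more students than the state one in a standard 180-day year, even though the paper reports that the results do not differ across definitions.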
Journal description:
As the flagship publication of the Society for Research on Educational Effectiveness, the Journal of Research on Educational Effectiveness (JREE) publishes original articles from the multidisciplinary community of researchers who are committed to applying principles of scientific inquiry to the study of educational problems. Articles published in JREE should advance our knowledge of factors important for educational success and/or improve our ability to conduct further disciplined studies of pressing educational problems. JREE welcomes manuscripts that fit into one of the following categories: (1) intervention, evaluation, and policy studies; (2) theory, contexts, and mechanisms; and (3) methodological studies. The first category includes studies that focus on process and implementation and seek to demonstrate causal claims in educational research. The second category includes meta-analyses and syntheses, descriptive studies that illuminate educational conditions and contexts, and studies that rigorously investigate education processes and mechanisms. The third category includes studies that advance our understanding of theoretical and technical features of measurement and research design and describe advances in data analysis and data modeling. To establish a stronger connection between scientific evidence and educational practice, studies submitted to JREE should focus on pressing problems found in classrooms and schools. Studies that help advance our understanding and demonstrate effectiveness related to challenges in reading, mathematics education, and science education are especially welcome, as are studies related to cognitive functions, social processes, organizational factors, and cultural features that mediate and/or moderate critical educational outcomes. On occasion, invited responses to JREE articles and rejoinders to those responses will be included in an issue.