{"title":"Using Existing School Messaging Platforms to Inform Parents about Their Child’s Attendance","authors":"Tareena Musaddiq, Alexa Prettyman, Jonathan Smith","doi":"10.1080/19345747.2023.2264841","DOIUrl":null,"url":null,"abstract":"AbstractSchool attendance is strongly associated with academic success and high school completion, but approximately one-in-seven students miss nearly one month of school each year. To address absenteeism, we partnered with four public school districts in the metro-Atlanta area and experimentally deployed email and text messages to inform parents about their child’s attendance. Parents received personalized monthly messages through the school districts’ existing messaging platforms that had zero marginal cost per message. The messages informed parents about their child’s number of absences and how that number compared to absences of their peers. For most parents, this information was delivered through email as opposed to text, and parents of students most in need of improved attendance were the hardest to reach. Intent-to-treat estimates show the intervention reduced end-of-year absences by four-tenths to two-thirds of a day (2 to 3%) and reduced the probability of chronic absenteeism by 2 to 6%, while actually receiving the messages reduced end-of-year absences by two-thirds to almost one day (3 to 4%) and reduced the probability of chronic absenteeism by 4 to 7%.Keywords: Chronic absenteeismattendance nudgesdistrict communication Disclosure StatementNo potential conflict of interest was reported by the author(s).Notes1 We use “parent” as a general term to represent a student’s caregiver, whether a parent or legal guardian.2 Our study only uses the district-wide messaging platform to contact parents. Parents may be contacted by school administration and teachers through various other ways (e.g., classroom apps) independent of this district-wide system. Our study does not include those means of communication and cannot speak to their effectiveness.3 Subsample analyses in the two districts that implemented the experiment with fidelity reveal similar improvements in attendance for students in grades K-8 (but not high school), and across gender, race, and most other demographic characteristics. However, we do not find a statistically significant improvement for Hispanic students or ELL students, despite the fact that the message was translated into the parents’ preferred language in the messaging platform.4 See Table A1 for a full list of studies and details in recent years.5 See, for example, Frey and Meier (Citation2004), Shang and Croson (Citation2009), Ferraro et al. (Citation2011), and Coffman et al. (Citation2017).6 See Nguyen (Citation2008), Jensen (Citation2010), Oreopoulos and Dunn (Citation2013), Dinkelman and Martínez (2014) for evidence on how education outcomes improve after parents or students are informed about the returns to, or costs of, educational investments. Bettinger et al. 
(Citation2012) is an example in which information alone was insufficient for improving educational attainment.7 For a more detailed review on nudges in education, please see Damgaard and Nielsen (Citation2018).8 See Table A1 for details about the location, timing, target population, intervention, and results for recent studies that have used nudge theory to influence attendance behavior.9 Authors’ calculation based on the interactive data visualization tool for chronic absences across the United States, which is available at: http://www.hamiltonproject.org/charts/chronic_absence_across_the_united_states.10 U.S. Census. https://patch.com/georgia/atlanta/how-georgia-education-spending-ranks-nationwide-census-bureau. https://www.census.gov/library/visualizations/2017/comm/cb17-97-public-education-finance.html11 School Messenger offers a variety of services to the school districts including mobile apps, student emails, and school websites. For more information see www.schoolmessenger.com. Blackboard is a Learning Management System that allows students and teachers to access learning resources online, view course contents and grades, and participate in online discussion forums. For more information, see www.blackboard.com/k12/index.html.12 To define our target sample of students, we followed the U.S. Department of Education in defining chronic absenteeism as missing 15 days of school regardless of being excused or unexcused (see: https://www2.ed.gov/datastory/chronicabsenteeism.html#intro). The state of Georgia, however, defines chronic absenteeism as missing 10% or more of school days in a year, which is roughly 18 or more days in a year.13 At the end of the school year, 73% of the control group had 15 or more absences, indicating that the linear model was fairly accurate in predicting chronic absenteeism.14 In one district, where the process varied slightly, only students with valid parental contact details were selected to be part of the experiment and assigned to control and treatment groups. Additionally, students indicated as medically fragile were removed from the experimental sample.15 Moreover, at the request of the districts, we assigned the minimum number of students needed to detect modest effect sizes to the treatment, as opposed to splitting the experimental group evenly, which would have maximized statistical power. Based on our reading of the literature and our discussions with the district, we used a one-day decrease in absences as the target.16 We conduct sensitivity analyses for the attrition rates, baseline balance, and main results excluding this district. Using Figure A1, the overall attrition rate for Districts A, B, and C is 0.213. The attrition rates in the control and treatment groups are 0.233 and 0.160, respectively. The baseline imbalance on initial absences is in the opposite direction of the imbalance on initial absences in District D (see Table 2). The main results are smaller in magnitude and imprecisely estimated (see Table A5).17 One district’s opt-out message read: “Thank you for agreeing to participate in our study to improve school attendance at [Your District]. If you’d like to stop receiving messages about your child’s absences or you are receiving this in error, please fill out the information below. You will not receive any further communications about this study, but you will still receive other district related communications (weather closings, school/district announcements, meal balances, etc.). 
Please be sure to include the email and/or phone number to which you received this initial message.”18 The messages varied slightly across districts due to district-specific needs and preferences. The timing of the first message also varied across districts. Table A2 provides details of the differences across districts. Of particular note, District D, which sees the biggest effect, sent a text message that reminds parents to check their email, which contains the above details.19 After the first month of messages, we did not send messages the following months if a student’s year-to-date absences decreased relative to their previous month or if their percentile rank was less than 50%. These 1.5% of treatment students were likely a result of updated administrative records, and we wanted to avoid sending inconsistent or inaccurate messages.20 Race/ethnicity are not mutually exclusive.21 Using the state definition to measure chronic absenteeism (i.e. missing at least 10% of days enrolled), 61% of the students in the experimental group across all four districts were chronically absent.22 See Table A4 for more summary statistics by district of the experimental group relative to the non-experimental group.23 See Table A4 for the demographic characteristics of the students in the non-experimental group, control and treatment groups, broken down by district.24 Weights are calculated separately for each district and sum to the size of the control group within the district. These weights reduce the imbalance on initial absences (see the weighted balance check in Table A6), and the weighted results in Table A7 are statistically indistinguishable from the main results.25 One district did not send texts and instead made robocalls. See Table A3 for more details about implementation. We interpret the results from a robocall similar to a text since the communication is via phone. The delivery status for a robocall is either answered, answering machine, or invalid phone number. A robocall was considered received if the delivery status was answered or answering machine.26 Per the SchoolMessenger Communicate user guide, “sent” indicates that the message was sent, but SchoolMessenger has not received verification from the recipient’s email server, “delivered” indicates that the message was sent and SchoolMessenger has received confirmation that the recipient’s email server successfully queued the message for delivery, and “opened” indicates the message was opened by the recipient. We do not differentiate between “sent”, “delivered”, and “opened” because some of this classification is a function of when the delivery report was pulled. In districts that pulled the delivery report immediately after sending the message, there are fewer “opened” messages than in districts that pulled the delivery report later.27 Enrollment and absence data for 97% of the students come directly from the school districts. The attendance data for the remaining 3% of students were missing, so we use data about and from the districts that is collected and cleaned by the state. The two measures almost perfectly align in data in overlapping observations, but we still control for an indicator for the source of the data.28 Our level of randomization was at the individual level; therefore, we use robust standard errors and do not cluster at a higher aggregation. We do however check for the statistical significance of our estimates to clustering standard errors by (i) school, (ii) grade by district, (iii) grade, and (iv) district. 
Our estimates remain robust in all the cases.29 These effect sizes are statistically significant at the 99% confidence level for the districts that implemented with fidelity and at the 95% confidence level for all four districts. Among Districts B, C, and D the intent-to-treat effect size is a 0.552-day reduction and the treatment-on-treated effect size is 0.791-day reduction. Both estimates are statistically significant at the 99% confidence level.30 Not only did District D implement with fidelity, but they provide the most statistical power because all students in the experiment had valid contact information for their parents.31 This can be seen through the proportion of students’ parents who received the first message, first text, last message, and last text in Table A4. In addition, Figure A2 and Figure A3 graph the probability of receiving the first and last message via any mode of contact or text, respectively, by the number of initial days absent.32 We measure chronic absenteeism using both the national and state definitions. Under the national definition, students who were absent 15 or more days from school in a year are indicated as chronically absent, and under the state definition, students who missed 10% or more of the days they were enrolled are indicated as chronically absent. The results do not differ across definitions.33 Due to data limitations, this analysis is only available for the two districts that implemented the experiment with fidelity.Additional informationFundingThe authors would like to thank Arnold Ventures for generously supporting this study through a Policy Labs grant. Tareena Musaddiq’s postdoctoral fellowship was supported by training grant R305B170015 from the U.S. Department of Education’s Institute of Education Sciences. The opinions expressed in this research are those of the researchers and are not attributable to the school districts. The intervention was pre-registered on Open Science Framework and can be accessed here: https://osf.io/n6jwy.","PeriodicalId":47260,"journal":{"name":"Journal of Research on Educational Effectiveness","volume":null,"pages":null},"PeriodicalIF":1.7000,"publicationDate":"2023-11-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Research on Educational Effectiveness","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/19345747.2023.2264841","RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 2
Abstract
School attendance is strongly associated with academic success and high school completion, but approximately one in seven students miss nearly one month of school each year. To address absenteeism, we partnered with four public school districts in the metro-Atlanta area and experimentally deployed email and text messages to inform parents about their child’s attendance. Parents received personalized monthly messages through the school districts’ existing messaging platforms, which had zero marginal cost per message. The messages informed parents of their child’s number of absences and of how that number compared to the absences of their peers. For most parents, this information was delivered through email rather than text, and parents of the students most in need of improved attendance were the hardest to reach. Intent-to-treat estimates show the intervention reduced end-of-year absences by four-tenths to two-thirds of a day (2 to 3%) and reduced the probability of chronic absenteeism by 2 to 6%, while actually receiving the messages reduced end-of-year absences by two-thirds to almost one day (3 to 4%) and reduced the probability of chronic absenteeism by 4 to 7%.

Keywords: chronic absenteeism; attendance nudges; district communication

Disclosure Statement

No potential conflict of interest was reported by the author(s).

Notes

1. We use “parent” as a general term for a student’s caregiver, whether a parent or a legal guardian.

2. Our study uses only the district-wide messaging platform to contact parents. Parents may be contacted by school administrators and teachers through various other channels (e.g., classroom apps) independent of this district-wide system. Our study does not include those means of communication and cannot speak to their effectiveness.

3. Subsample analyses in the two districts that implemented the experiment with fidelity reveal similar improvements in attendance for students in grades K-8 (but not high school) and across gender, race, and most other demographic characteristics. However, we do not find a statistically significant improvement for Hispanic or ELL students, even though the messaging platform translated the message into the parents’ preferred language.

4. See Table A1 for a full list of recent studies and their details.

5. See, for example, Frey and Meier (2004), Shang and Croson (2009), Ferraro et al. (2011), and Coffman et al. (2017).

6. See Nguyen (2008), Jensen (2010), Oreopoulos and Dunn (2013), and Dinkelman and Martínez (2014) for evidence on how education outcomes improve after parents or students are informed about the returns to, or costs of, educational investments. Bettinger et al. (2012) is an example in which information alone was insufficient to improve educational attainment.

7. For a more detailed review of nudges in education, see Damgaard and Nielsen (2018).

8. See Table A1 for the location, timing, target population, intervention, and results of recent studies that have used nudge theory to influence attendance behavior.

9. Authors’ calculation based on the interactive data visualization tool for chronic absences across the United States, available at http://www.hamiltonproject.org/charts/chronic_absence_across_the_united_states.

10. U.S. Census. https://patch.com/georgia/atlanta/how-georgia-education-spending-ranks-nationwide-census-bureau. https://www.census.gov/library/visualizations/2017/comm/cb17-97-public-education-finance.html

11. SchoolMessenger offers a variety of services to school districts, including mobile apps, student emails, and school websites. For more information, see www.schoolmessenger.com. Blackboard is a learning management system that allows students and teachers to access learning resources online, view course contents and grades, and participate in online discussion forums. For more information, see www.blackboard.com/k12/index.html.

12. To define our target sample of students, we followed the U.S. Department of Education in defining chronic absenteeism as missing 15 or more days of school, whether excused or unexcused (see https://www2.ed.gov/datastory/chronicabsenteeism.html#intro). The state of Georgia, however, defines chronic absenteeism as missing 10% or more of school days in a year, which is roughly 18 or more days.
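To make the two definitions in note 12 concrete, the following is a minimal illustrative sketch, not drawn from the paper; the data frame, column names, and example values are hypothetical.

```python
import pandas as pd

# Hypothetical student-level records: days absent and days enrolled.
students = pd.DataFrame({
    "student_id": [1, 2, 3],
    "days_absent": [16, 12, 20],
    "days_enrolled": [180, 150, 175],
})

# National (U.S. Department of Education) definition: 15 or more
# absences in a year, excused or unexcused.
students["chronic_national"] = students["days_absent"] >= 15

# Georgia definition: missing 10% or more of the days enrolled
# (roughly 18 or more days in a full 180-day year).
students["chronic_state"] = (
    students["days_absent"] / students["days_enrolled"] >= 0.10
)

print(students)
```

Note that the two flags can disagree for students enrolled for only part of the year, which is why the paper reports results under both definitions (see note 32).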
13. At the end of the school year, 73% of the control group had 15 or more absences, indicating that the linear model was fairly accurate in predicting chronic absenteeism.

14. In one district, where the process varied slightly, only students with valid parental contact details were selected for the experiment and assigned to the control and treatment groups. Additionally, students flagged as medically fragile were removed from the experimental sample.

15. Moreover, at the districts’ request, we assigned to treatment the minimum number of students needed to detect modest effect sizes, as opposed to splitting the experimental group evenly, which would have maximized statistical power. Based on our reading of the literature and our discussions with the districts, we used a one-day decrease in absences as the target.
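As an illustration of the kind of power calculation that sits behind note 15, here is a minimal sketch using statsmodels. The assumed standard deviation of absences (8 days), the power and significance targets, and the control-to-treatment ratio are hypothetical inputs, not values from the study.

```python
from statsmodels.stats.power import TTestIndPower

# Target effect: a 1-day reduction in absences, expressed as Cohen's d
# under an assumed standard deviation of 8 days (not from the paper).
target_days = 1.0
sd_absences = 8.0
effect_size = target_days / sd_absences

analysis = TTestIndPower()

# Smallest treatment-group size needed for 80% power at alpha = 0.05,
# assuming a control group twice the size of treatment (ratio = 2).
n_treatment = analysis.solve_power(
    effect_size=effect_size, alpha=0.05, power=0.80, ratio=2.0
)
print(f"Treatment-group size needed: {n_treatment:.0f}")
```

An uneven split like this minimizes the number of treated students while keeping the minimum detectable effect at the one-day target, which matches the districts’ preference described in the note.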
16. We conduct sensitivity analyses of the attrition rates, baseline balance, and main results excluding this district. Using Figure A1, the overall attrition rate for Districts A, B, and C is 0.213; the attrition rates in the control and treatment groups are 0.233 and 0.160, respectively. The baseline imbalance on initial absences runs in the opposite direction of the imbalance in District D (see Table 2). The main results are smaller in magnitude and imprecisely estimated (see Table A5).

17. One district’s opt-out message read: “Thank you for agreeing to participate in our study to improve school attendance at [Your District]. If you’d like to stop receiving messages about your child’s absences or you are receiving this in error, please fill out the information below. You will not receive any further communications about this study, but you will still receive other district related communications (weather closings, school/district announcements, meal balances, etc.). Please be sure to include the email and/or phone number to which you received this initial message.”

18. The messages varied slightly across districts due to district-specific needs and preferences, as did the timing of the first message. Table A2 details the differences across districts. Of particular note, District D, which sees the largest effect, sent a text message reminding parents to check their email, which contained the details above.

19. After the first month of messages, we did not send messages in subsequent months if a student’s year-to-date absences decreased relative to the previous month or if their percentile rank fell below 50%. These cases, 1.5% of treatment students, were likely the result of updated administrative records, and we wanted to avoid sending inconsistent or inaccurate messages.

20. Race/ethnicity categories are not mutually exclusive.

21. Using the state definition of chronic absenteeism (i.e., missing at least 10% of days enrolled), 61% of the students in the experimental group across all four districts were chronically absent.

22. See Table A4 for more summary statistics, by district, of the experimental group relative to the non-experimental group.

23. See Table A4 for the demographic characteristics of the students in the non-experimental, control, and treatment groups, broken down by district.

24. Weights are calculated separately for each district and sum to the size of the control group within the district. These weights reduce the imbalance on initial absences (see the weighted balance check in Table A6), and the weighted results in Table A7 are statistically indistinguishable from the main results.

25. One district did not send texts and instead made robocalls. See Table A3 for more details about implementation. We interpret results from a robocall similarly to a text, since the communication is via phone. The delivery status for a robocall is answered, answering machine, or invalid phone number; a robocall was considered received if the status was answered or answering machine.

26. Per the SchoolMessenger Communicate user guide, “sent” indicates that the message was sent but SchoolMessenger has not received verification from the recipient’s email server; “delivered” indicates that the message was sent and SchoolMessenger has received confirmation that the recipient’s email server successfully queued the message for delivery; and “opened” indicates that the message was opened by the recipient. We do not differentiate between “sent”, “delivered”, and “opened” because some of this classification is a function of when the delivery report was pulled: in districts that pulled the delivery report immediately after sending the message, there are fewer “opened” messages than in districts that pulled the report later.

27. Enrollment and absence data for 97% of the students come directly from the school districts. Attendance data for the remaining 3% of students were missing, so we use district data collected and cleaned by the state. The two measures align almost perfectly in overlapping observations, but we still control for an indicator of the data source.

28. Randomization was at the individual level; therefore, we use robust standard errors and do not cluster at a higher level of aggregation. We do, however, check the robustness of our estimates to clustering standard errors by (i) school, (ii) grade by district, (iii) grade, and (iv) district. Our estimates remain robust in all cases.

29. These effect sizes are statistically significant at the 99% confidence level for the districts that implemented with fidelity and at the 95% confidence level for all four districts. Among Districts B, C, and D, the intent-to-treat effect size is a 0.552-day reduction and the treatment-on-the-treated effect size is a 0.791-day reduction. Both estimates are statistically significant at the 99% confidence level.
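To illustrate the intent-to-treat and treatment-on-the-treated estimators referenced in the abstract and in notes 28 and 29, here is a minimal sketch, not the authors’ code. The simulated data frame and variable names are hypothetical; TOT is computed as the ratio of the reduced-form (ITT) coefficient to the first-stage coefficient, which equals the two-stage least squares estimate in this single-instrument case.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: 'assigned' is random assignment to treatment,
# 'received' indicates the parent actually received the messages,
# 'absences' is end-of-year absences (made-up data-generating process).
rng = np.random.default_rng(0)
n = 5000
assigned = rng.integers(0, 2, n)
received = assigned * rng.binomial(1, 0.7, n)  # only the assigned can receive
initial_absences = rng.poisson(6, n)
absences = 16 - 0.8 * received + rng.normal(0, 8, n)
df = pd.DataFrame({
    "assigned": assigned,
    "received": received,
    "initial_absences": initial_absences,
    "absences": absences,
})

# Intent-to-treat: regress the outcome on assignment with robust
# (heteroskedasticity-consistent) standard errors, as in note 28.
itt = smf.ols("absences ~ assigned + initial_absences", data=df).fit(cov_type="HC1")
print("ITT effect:", itt.params["assigned"])

# Treatment-on-the-treated: scale the ITT estimate by the first-stage
# effect of assignment on actually receiving the messages.
first_stage = smf.ols("received ~ assigned + initial_absences", data=df).fit(cov_type="HC1")
print("TOT effect:", itt.params["assigned"] / first_stage.params["assigned"])
```

Because not every assigned parent could be reached, the TOT estimate is mechanically larger in magnitude than the ITT estimate, consistent with the pattern of effect sizes reported in the abstract.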
30. Not only did District D implement with fidelity, but it also provides the most statistical power because all students in the experiment had valid contact information for their parents.

31. This can be seen in the proportion of students’ parents who received the first message, first text, last message, and last text in Table A4. In addition, Figures A2 and A3 graph the probability of receiving the first and last message via any mode of contact or via text, respectively, by the number of initial days absent.

32. We measure chronic absenteeism using both the national and state definitions. Under the national definition, students who were absent 15 or more days in a year are chronically absent; under the state definition, students who missed 10% or more of the days they were enrolled are chronically absent. The results do not differ across definitions.

33. Due to data limitations, this analysis is available only for the two districts that implemented the experiment with fidelity.

Additional Information

Funding

The authors would like to thank Arnold Ventures for generously supporting this study through a Policy Labs grant. Tareena Musaddiq’s postdoctoral fellowship was supported by training grant R305B170015 from the U.S. Department of Education’s Institute of Education Sciences. The opinions expressed in this research are those of the researchers and are not attributable to the school districts. The intervention was pre-registered on the Open Science Framework and can be accessed at https://osf.io/n6jwy.
Journal Description
As the flagship publication for the Society for Research on Educational Effectiveness, the Journal of Research on Educational Effectiveness (JREE) publishes original articles from the multidisciplinary community of researchers who are committed to applying principles of scientific inquiry to the study of educational problems. Articles published in JREE should advance our knowledge of factors important for educational success and/or improve our ability to conduct further disciplined studies of pressing educational problems. JREE welcomes manuscripts that fit into one of the following categories: (1) intervention, evaluation, and policy studies; (2) theory, contexts, and mechanisms; and (3) methodological studies. The first category includes studies that focus on process and implementation and seek to demonstrate causal claims in educational research. The second category includes meta-analyses and syntheses, descriptive studies that illuminate educational conditions and contexts, and studies that rigorously investigate education processes and mechanisms. The third category includes studies that advance our understanding of theoretical and technical features of measurement and research design and describe advances in data analysis and data modeling. To establish a stronger connection between scientific evidence and educational practice, studies submitted to JREE should focus on pressing problems found in classrooms and schools. Studies that help advance our understanding and demonstrate effectiveness related to challenges in reading, mathematics education, and science education are especially welcome, as are studies related to cognitive functions, social processes, organizational factors, and cultural features that mediate and/or moderate critical educational outcomes. On occasion, invited responses to JREE articles and rejoinders to those responses will be included in an issue.