What Educators and Policymakers Should Know
As many educators and parents know all too well, the summer is a key time in students’ social and cognitive development, and it contributes substantially to achievement gaps. As a result, summer interventions have the potential not only to mitigate summer learning loss but also to reduce persistent achievement gaps.
In our chapter from The Summer Slide: What We Know and Can Do About Summer Learning Loss, from which this article is drawn, we reviewed a foundational meta-analysis of summer learning programs conducted by researchers as well as evidence from 25 studies of such programs since 2000. The programs covered in our review included voluntary at-home summer reading programs, voluntary classroom-based summer programs, and mandatory summer programs that students must attend to avoid in-grade retention.
The evidence suggests that many types of summer learning programs have the potential to reduce summer learning losses and perhaps create learning gains. However, implementing a summer program does not guarantee positive effects on students’ learning. A key question then is: What factors make a summer learning program effective?
Components of Quality Summer Learning Programs
Small Class Sizes
Research has found that small class size is associated with summer program effectiveness. One study found that summer programs with class sizes capped at 20 students were more effective than others in producing achievement gains.1 In another study, researchers found no statistically significant relationship between class size alone and program effects, but they found positive effects when small classes were combined with significant program resources (defined as class sizes of no more than 13, at least four hours of participation per day, and at least 70 hours of total participation).2 They analyzed 12 studies with enough detail to investigate whether program resources mediated students’ learning. Of those 12, the five studies that met these criteria had large, statistically significant positive effects on students’ learning, and the seven studies that did not meet the criteria had no statistically significant effect on students’ learning.
Other researchers similarly combined instructional hours with class size to test whether more individual attention offered due to smaller classes might improve results.3 Although they found a positive relationship between the number of hours of instructional time and math achievement, they did not find a relationship when it was further combined with class size. This may be because prevailing class sizes across the five studied districts were all small, ranging from an average of eight to 14 students per teacher. Furthermore, researchers found large positive effects of an intense summer literacy program on students’ reading outcomes.4 The program used daily small-group (three to five children), research-based instruction.
To sum up, programs with small classes and significant resources provide teachers with more time to work individually with students and to create greater opportunities to differentiate instruction based on student needs. Such programs may also be particularly beneficial during the summer, when teachers have much less time to get to know the students in their classrooms.
Aligned to Student Needs
Learning science recommends that in order to maximize the benefit of academic experiences, especially in literacy, students’ assignments should be well aligned to their interests and needs.5 Summer learning programs should therefore align instruction to school-year activities, and instruction should be tightly focused on addressing students’ needs with high-quality instruction.6 The findings from the many replications of Project READS,7 an at-home summer literacy intervention, clearly show that students are not only more likely to read over the summer when books are aligned to their interests and matched to their reading levels, but they are also more likely to comprehend what they are reading, and these comprehension effects persist into the following school year.
The results from Project READS also suggest that sending students books matched to their reading levels and interests over the summer, with the expectation that they will read them, is not enough. In the absence of a structured school setting, struggling students also need continued support during the summer. For example, researchers tested whether students who were given resources meant to mimic school-year learning opportunities outperformed students who were just given basic prompts to read books over the summer.8 They found that an approach combining a scaffolded summer reading intervention with prompts to read over the summer increased the amount of time students spent reading and improved their comprehension, relative to students who were either just mailed books home or given no treatment at all (i.e., neither scaffolding nor books).
Finally, the Project READS work also tested whether incentives to read over the summer enhanced students’ summer reading habits and comprehension. Researchers tested two different treatments. In the first, students were supplied with books to read over the summer aligned to their skills and interests. In the second, students were given books plus points, redeemable for toys, games, and other prizes, for each book they read.9 At the end of the summer, the intervention was effective only for motivated students (as measured by baseline surveys), and the use of incentives actually widened the achievement gap between motivated and unmotivated students. As such, it is important not only to align students’ work with their interests and ability levels, but also to build in structures to support learning during the summer, especially for at-home programs.
Teacher quality also matters. One study found a positive, statistically significant association between prior teaching experience and reading outcomes.10 Specifically, it found that students whose summer teachers had just taught either their sending or receiving grade performed better than other students on a fall reading assessment. To recruit and hire the right teachers, researchers recommend developing rigorous selection processes to attract motivated teachers and, to the extent possible, taking teachers’ school-year performance into consideration.11 They also stress the importance of hiring teachers with not only grade-level but also subject-matter experience and, if possible, familiarity with the students.
Beyond recruiting qualified teachers, how teachers deliver the curriculum also matters. In one study, researchers observed and evaluated instructional quality in each classroom.12 Their analysis found a positive association between quality of instruction and student performance in reading. (They did not find such a relationship in mathematics.) Furthermore, researchers examined voluntary and at-home literacy programs that used research-based instruction, such as guided repeated oral reading, that related readings to students’ prior experiences and explicitly modeled strategies for students.13 Programs that included these practices had significantly larger positive effects on students’ reading outcomes than programs that did not.
In efforts to ensure high-quality instruction, researchers recommend anchoring summer literacy programs in an evidence-based curriculum;14 providing professional development to teachers;15 tying small-group instruction explicitly to learning goals;16 and providing teachers with instructional support, such as coaching, during the program.17
A positive, orderly site climate also appears to matter. Researchers expected that students in more orderly sites would have better outcomes because they and their teachers would be less likely to be distracted by misbehavior.18 To evaluate student discipline and order in the district programs they studied, they created a scale for each site within each district based on teacher survey data. On the survey, teachers were asked for their observations of student bullying, physical fighting, and other indicators of orderliness. They found that students who attended more orderly sites outperformed other students on the fall reading assessment.
Policies to Maximize Participation and Attendance
Consistent attendance is crucial not only for school-year learning but for summer learning as well.19 Researchers did not find differences in effectiveness between summer programs that did and did not monitor attendance, so tracking attendance, while a good policy, is likely insufficient on its own to increase it.20 To promote consistent attendance, researchers recommend setting enrollment deadlines, establishing a clear attendance policy, and providing field trips and other incentives for students who attend.21 They also found that it is not necessary to disguise academics to boost attendance: the district with the highest attendance rate in the study ran the most “school-like” program, with the most explicit academic instruction.
Researchers generally distinguish between allocated time (the time on the school calendar for a given content area) and academic learning time (the amount of time students spend working on rigorous tasks at the appropriate level of difficulty). Academic learning time is more predictive of student achievement.22 Furthermore, research also suggests that spaced practice (once a day for several days), as opposed to one long, concentrated lesson (all day long for just one day), appears to be more effective in facilitating learning.23 When focusing on boosting students’ literacy skills, researchers recommend that students receive at least two hours of teacher-directed daily instruction blended between whole-group and small-group (three to five students) lessons and that the program meet regularly during the week (four to five times) for at least five weeks.24
Similarly, researchers recommend that school districts plan for programs to run at least five weeks and schedule 60–90 minutes of mathematics per day to maximize effectiveness.25 Because instructional time on task is reduced due to student absences and inefficient use of time during the day, researchers suggest special efforts to promote consistent attendance, maintain daily schedules, and ensure teachers maximize instructional time in the classroom.
For educators, administrators, and policymakers looking to strengthen their summer learning programs, we suggest keeping the following in mind. First, research shows no consistent relationship between the effectiveness of summer learning programs and students’ backgrounds or the grade level of the intervention. This implies that there is no “best” target population of students for summer programming. Furthermore, simply offering a program does not guarantee it will benefit students.
Second, research indicates that for summer programs to be effective, they must be of sufficient duration (i.e., at least five weeks long or 70 hours of academic programming) and achieve consistent student attendance. Students also benefit from individualized and aligned instruction and class sizes smaller than 20 students.
In addition, high-quality instruction (promoted through careful hiring and professional development) by teachers who have recently taught the sending or receiving grade contributes to positive student outcomes, as does providing that instruction in orderly summer sites with low levels of physical fighting or bullying.
It is our hope that this research encourages districts and providers to plan carefully for summer programming, building in the quality components that make such programs effective.
Andrew McEachin is a policy researcher in the economics, statistics, and sociology department at the Rand Corporation. Catherine H. Augustine is a senior policy researcher and quality assurance manager at Rand. Jennifer McCombs is a senior policy researcher and the director of the behavioral and policy sciences department at Rand and a professor at the Pardee Rand Graduate School. This article was reprinted by permission of the publisher from Karl Alexander, Sarah Pitcock, and Matthew Boulay (eds.), The Summer Slide: What We Know and Can Do About Summer Learning Loss (New York: Teachers College Press). Copyright © 2016 by Teachers College, Columbia University. All rights reserved.
1. Harris Cooper et al., “Making the Most of Summer School: A Meta-Analytic and Narrative Review,” Monographs of the Society for Research in Child Development 65, no. 1 (2000).
2. James S. Kim and David M. Quinn, “The Effects of Summer Reading on Low-Income Children’s Literacy Achievement from Kindergarten to Grade 8: A Meta-Analysis of Classroom and Home Interventions,” Review of Educational Research 83 (2013): 386–431.
3. Jennifer Sloan McCombs et al., Ready for Fall? Near-Term Effects of Voluntary Summer Learning Programs on Low-Income Students’ Learning Opportunities and Outcomes, Rand Summer Learning Series (Santa Monica, CA: Rand Corporation, 2014).
4. Keith Zvoch and Joseph J. Stevens, “Summer School Effects in a Randomized Field Trial,” Early Childhood Research Quarterly 28 (2013): 24–31.
5. Cf. Benjamin D. Wright and Mark H. Stone, Making Measures (Chicago: Phaneron Press, 2004).
6. Cf. Zvoch and Stevens, “Summer School Effects in a Randomized Field Trial”; and Keith Zvoch and Joseph J. Stevens, “Identification of Summer School Effects by Comparing the In- and Out-of-School Growth Rates of Struggling Early Readers,” Elementary School Journal 115 (2015): 433–456.
7. See Jonathan Guryan, James S. Kim, and Kyung Park, “Motivation and Incentives in Education: Evidence from a Summer Reading Experiment,” NBER Working Paper Series, no. 20918 (Cambridge, MA: National Bureau of Economic Research, 2015); James S. Kim, “Summer Reading and the Ethnic Achievement Gap,” Journal of Education for Students Placed at Risk 9 (2004): 169–188; James S. Kim, “Effects of a Voluntary Summer Reading Intervention on Reading Achievement: Results from a Randomized Field Trial,” Educational Evaluation and Policy Analysis 28 (2006): 335–355; James S. Kim and Jonathan Guryan, “The Efficacy of a Voluntary Summer Book Reading Intervention for Low-Income Latino Children from Language Minority Families,” Journal of Educational Psychology 102 (2010): 20–31; and James S. Kim and Thomas G. White, “Scaffolding Voluntary Summer Reading for Children in Grades 3 to 5: An Experimental Study,” Scientific Studies of Reading 12 (2008): 1–23.
8. Kim and White, “Scaffolding.”
9. Guryan, Kim, and Park, “Motivation and Incentives.”
10. McCombs et al., Ready for Fall?
11. Catherine H. Augustine et al., Getting to Work on Summer Learning: Recommended Practices for Success (Santa Monica, CA: Rand Corporation, 2013).
12. McCombs et al., Ready for Fall?
13. Kim and Quinn, “Effects of Summer Reading.” See also, National Reading Panel, Teaching Children to Read: An Evidence-Based Assessment of the Scientific Research Literature on Reading and Its Implications for Reading Instruction (Washington, DC: National Institute of Child Health and Human Development, 2000).
14. Augustine et al., Getting to Work.
15. Susanne R. Bell and Natalie Carrillo, “Characteristics of Effective Summer Learning Programs in Practice,” New Directions for Youth Development 2007, no. 114 (2007): 45–63; Suzie Boss and Jennifer Railsback, Summer School Programs: A Look at the Research, Implications for Practice, and Program Sampler (Portland, OR: Northwest Regional Educational Laboratory, 2002); David R. Denton, Summer School: Unfulfilled Promise (Atlanta: Southern Regional Education Board, 2002); and Brenda McLaughlin and Sarah Pitcock, Building Quality in Summer Learning Programs: Approaches and Recommendations (New York: Wallace Foundation, 2009).
16. Zvoch and Stevens, “Summer School Effects in a Randomized Field Trial.”
17. Augustine et al., Getting to Work.
18. McCombs et al., Ready for Fall?
19. Geoffrey D. Borman, James Benson, and Laura T. Overman, “Families, Schools, and Summer Learning,” Elementary School Journal 106 (2005): 131–150; Geoffrey D. Borman and N. Maritza Dowling, “Longitudinal Achievement Effects of Multiyear Summer School: Evidence from the Teach Baltimore Randomized Field Trial,” Educational Evaluation and Policy Analysis 28 (2006): 25–48; Jennifer Sloan McCombs, Sheila Nataraj Kirby, and Louis T. Mariano, eds., Ending Social Promotion without Leaving Children Behind: The Case of New York City (Santa Monica, CA: Rand Corporation, 2009); and McCombs et al., Ready for Fall?
20. Cooper et al., “Making the Most of Summer School.”
21. Augustine et al., Getting to Work.
22. Charles W. Fisher et al., “Teaching Behaviors, Academic Learning Time, and Student Achievement: An Overview,” in Time to Learn: A Review of the Beginning Teacher Evaluation Study, ed. Carolyn Denham and Ann Lieberman (Sacramento: California State Commission for Teacher Preparation and Licensing, 1980), 7–32; Annegret Harnischfeger and David E. Wiley, “The Teaching-Learning Process in Elementary Schools: A Synoptic View,” Curriculum Inquiry 6 (1976): 5–43; Willis D. Hawley et al., “Good Schools: What Research Says about Improving Student Achievement,” Peabody Journal of Education 61, no. 4 (1984); Nancy Karweit, “Should We Lengthen the School Term?,” Educational Researcher 14, no. 6 (1985): 9–15; and Nancy Karweit and Robert E. Slavin, “Time-on-Task: Issues of Timing, Sampling, and Definition,” Journal of Educational Psychology 74 (1982): 844–851.
23. Doug Rohrer and Harold Pashler, “Recent Research on Human Learning Challenges Conventional Instructional Strategies,” Educational Researcher 39, no. 5 (2010): 406–412; Doug Rohrer and Kelli Taylor, “The Effects of Overlearning and Distributed Practise on the Retention of Mathematics Knowledge,” Applied Cognitive Psychology 20 (2006): 1209–1224; and Herbert J. Walberg, “Synthesis of Research on Time and Learning,” Educational Leadership 45, no. 6 (March 1988): 76–85.
24. Zvoch and Stevens, “Summer School Effects in a Randomized Field Trial.”
25. McCombs et al., Ready for Fall?