Guest Opinion Essay

A Tale of Two Studies

The recently released final report on Mathematica’s study of after-school programs (“When Schools Stay Open Late: The National Evaluation of the 21st Century Community Learning Centers Program”) drew considerably less attention than did its initial report two years earlier, perhaps because it reached essentially the same conclusion: Weak programs produce weak results.
Meanwhile, a second national evaluation is beginning to tell a very different story. With support from the Charles Stewart Mott Foundation, the Wisconsin Center for Education Research and Policy Studies Associates are jointly conducting a Study of Promising After-School Programs, designed to identify practices that enhance academic, social, emotional and physical development. They also hope to assess whether disadvantaged youth ages 8 to 14 who participate in the targeted programs achieve significantly greater developmental and learning gains than disadvantaged youth who are denied such opportunities. While their results are not yet final, three interim reports (from September 2003, February 2004 and March 2005) offer compelling lessons about what constitutes a high-quality after-school experience.

A side-by-side review of these reports and the Mathematica study reveals sharp contrasts in the programs that were evaluated.
Mathematica’s research describes activities that, for the most part, have limited developmental goals and employ traditional academic approaches. While all of the elementary-level programs in the study sought to improve academic achievement, only 43 percent listed social development as a primary goal. And despite their stated intent to promote academic achievement, many programs were described as offering little real assistance. For example, reporting on “homework help” sessions, the Mathematica team noted, “Most centers neither monitored what homework students should have been doing nor ensured that students completed the homework their teachers had assigned.”

The description of other academic offerings in these programs includes: direct instruction; educational technology packages to reinforce basic skills; practice drills, worksheets and games to improve academic skills; preparation for standardized tests; and enrichment activities with an academic focus, such as science labs, Spanish, algebra clubs, robotics, technology and computer labs. Only this last set of items (about which we learn very little in the study) sounds the least bit interesting, innovative or different from what probably occurs during the regular school day.

Yet another indicator of the programs’ poor quality is the lack of coordination between the after-school and the school-day programs: Only 31 percent of after-school staff and 23 percent of teachers reported sharing information about homework assignments.

While the Mathematica evaluation did report a few positive findings – for example, that program participants felt safer after school than did nonparticipants – overall, it presents a fairly negative picture of program quality and results.

In contrast, the programs described in the Wisconsin-Policy Studies report convey a sense of excitement and a deep understanding of how young people engage with learning and development. Those programs include a middle school computer-based activity that blends writing, Web design and research; an elementary-level leadership council; a middle school chess club; cooperative story-telling that combines sculpture, writing, computer skills and graphic arts; writing and producing a play with the help of a professional actor; and outdoor cooperative math activities.

Why settle for the weak interventions described by the Mathematica team? Bring on more of the challenging approaches described in the Mott-funded study. Here are some ideas about how:

• Use training and technical assistance dollars within the federal 21st Century Community Learning Centers program to showcase promising practices and build provider capacity to deliver high-quality programs. The authorizing legislation allows states to spend 2 to 3 percent of their federal allocation on capacity-building.
• Create incentives for programs to apply high-quality assessment inventories, such as the Youth Program Quality Assessment tools by the High/Scope Educational Research Foundation (under “education programs/youth development” at www.highscope.org), or the National Afterschool Association’s quality assessment protocols (www.naaweb.org/accreditation.htm). Public and private funding sources could give competitive priority to applicants that verify program quality through objective and well-accepted rubrics. (See “Evaluation Spotlight,” page 34, for a description of an assessment tool adopted by New York State.)
• Synthesize the findings of these and other studies, focusing not just on program features but on underlying theories of change and expected outcomes. The Mott study offers a rich theory of change that is rooted in core developmental needs of children and youth. The 2002 revisions to the 21st Century CLC program, which emphasized academic enrichment and youth development, and the federally supported technical assistance contract with the Southwest Educational Development Laboratory, have set the stage for a serious effort at much-needed program strengthening.

Jane Quinn is assistant executive director for community schools at the Children’s Aid Society in New York City. Contact: janeq@childrensaidsociety.org.
