3 Mentoring Studies

Over the past two decades, policymakers have increasingly looked to schools as a way to deliver health and human services to students and families. With the onset of high-stakes testing and the pressure to improve academic achievement, school administrators were open to the idea of bringing community-based resources into their school buildings.

A mid-1990s study of the Big Brothers Big Sisters of America (BBBSA) mentoring program later fueled hope for a seemingly easy fix to the virtually insurmountable task of making the “adequate yearly progress” required under the No Child Left Behind law. The BBBSA study found that students assigned a mentor were less likely to use drugs, skip school, or get into fights than students still waiting to be paired with a mentor. Such positive behaviors are arguably linked to better academic outcomes.

Within 10 years of that report, school-based mentoring grew more than 450 percent in BBBSA programs alone. Federal funding for general school-based mentoring programs tripled by 2004, topping $50 million.

This much money directed at new programming leads to the obvious question: Just how much positive effect do mentors really have on improving academic achievement for at-risk students?

Three studies published in 2009 reported conflicting results, leaving policymakers and grant funders with the onerous task of figuring out what to do next.

The evaluation reviewed here takes a closer look at each of the three studies in an effort to explain the disparate results. A meta-analysis is a statistical method for combining results from different research studies. In the case of school-based mentoring, the authors wanted to know whether the conflicting results arose by chance, which would suggest that mentoring has no reliable effect on student achievement, or whether the results diverged because the mentoring programs themselves were different enough that their findings cannot be directly compared.
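
As a rough sketch of what “combining results” means in practice, the snippet below pools three study-level effect estimates with fixed-effect, inverse-variance weighting, the simplest form of meta-analysis. The effect sizes and standard errors are hypothetical placeholders chosen for illustration; they are not values reported by the BBBSA, CIS, or SMP evaluations.

```python
import math

# Illustrative only: hypothetical standardized effect sizes (d) and
# standard errors (se) for three studies; NOT the actual values reported
# by the BBBSA, CIS, or SMP evaluations.
studies = {
    "BBBSA": {"d": 0.09, "se": 0.06},
    "CIS":   {"d": 0.15, "se": 0.09},
    "SMP":   {"d": 0.01, "se": 0.04},
}

# Fixed-effect (inverse-variance) pooling: studies with smaller standard
# errors (usually larger samples) get more weight in the combined estimate.
weights = {name: 1 / s["se"] ** 2 for name, s in studies.items()}
pooled_d = sum(weights[n] * studies[n]["d"] for n in studies) / sum(weights.values())
pooled_se = math.sqrt(1 / sum(weights.values()))

# A 95 percent confidence interval that excludes zero suggests a reliable
# overall effect across the combined studies.
low, high = pooled_d - 1.96 * pooled_se, pooled_d + 1.96 * pooled_se
print(f"Pooled effect: {pooled_d:.3f} (95% CI {low:.3f} to {high:.3f})")
```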

 

Big Brothers Big Sisters of America (BBBSA) 2007 Study

Sample size: 1,139.

Schools: 41 elementary, 27 middle, and three high schools located in 10 rural and urban areas across the country.

Grades: Fourth- and fifth-graders, 61 percent; sixth- through eighth-graders, 34 percent; ninth-graders, 6 percent.

Student Race: Virtually even mix of African-American, Hispanic, white, and multi-racial.

Mentor Age: Half of all mentors were high school students. The rest were college students or older.

Mentor Race: More than three-quarters were white.

The national BBBSA office requires affiliates to abide by a set of standards that make the delivery of BBBSA services uniform across the country. Even so, BBBSA mentors are not required to attend initial training, though they must go to a monthly mentor support meeting. On average, mentors meet with their matches for 45 minutes weekly over 5.3 months, for a total of 17 hours.

 

Communities in Schools of San Antonio (CIS) 2008 Study

Sample size: 525.

Schools: Seven elementary, five middle, and seven high schools based in San Antonio, Texas.

Grades: Fifth-graders, 19 percent; sixth- through eighth-graders, 37 percent; ninth- through 12th-graders, 44 percent.

Student Race: 78 percent Hispanic.

Mentor Age: Almost three-quarters were college students. The rest were adults.

Mentor Race: Half of all mentors were Hispanic. A third of them were white.

CIS mentors must attend a one-hour training session before they begin meeting with their matches. Additional training is available, but mentors are not required to attend monthly support meetings. Mentors must meet with their mentees for one hour a week over the course of a school year. In practice, mentors averaged 10 hours with their matches.

 

U.S. Department of Education Student Mentoring Program (SMP) 2009 Study

Sample size: 2,360.

Schools: 103 elementary and middle schools located across the country.

Grades: Fourth- and fifth-graders, 42 percent; sixth- through eighth-graders, 44 percent; ninth- through 12th-graders, 14 percent. 

Student Race: Slightly more African-American than Hispanic or white.

Mentor Age: More than half were adults. A quarter of the mentors were college students and the rest were high school students.

Mentor Race: Two-thirds of all mentors were white. A little more than one-quarter were African-American.

As federal grant recipients, SMP programs must provide initial mentor training as well as monthly mentor support, but individual programs determine what training is provided. On average, mentors met with their mentees for 3.4 hours each time over 5.8 months, for a total of 23 hours.

 

Comparing findings

One can think of evaluating statistical significance as building a bridge between two points. Some studies, such as BBBSA’s, build on a wide yet stable foundation. If the statistical connection holds, readers can confidently cross from one point of information to the next. This structure is akin to a one-lane wooden bridge: each plank is a verified, significant reason for making the connection, but the bridge has some wiggle room, and some planks are weaker than others for various reasons.

Other studies, like the CIS one, tighten their criteria to minimize the risk that unrelated variables explain their results. The bridge narrows to a concrete walkway: more stable ground, with less wiggle room, so to speak.

Finally, studies such as the SMP one apply the highest statistical standard in an effort to make the most direct and confident connection. The pathway becomes something like a steel girder: the most solid connection, but also the narrowest, perhaps too narrow, cutting off other connectors that could span the crossing. In research terms, demanding this highest level of confidence increases the likelihood of failing to detect a real effect.

Because each of the original studies used different statistical criteria to test for significance, the meta-analysis re-ran the analyses using the same criteria for all three.
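
To make the bridge analogy concrete, the short sketch below shows how a single result can clear a conventional significance threshold yet fail a much stricter one, here a Bonferroni-style adjustment across the 19 outcome areas examined in the review. The p-value is hypothetical, and the adjustment is a generic illustration rather than the specific correction any of the three studies actually applied.

```python
# Illustrative only: the p-value below is hypothetical and is not taken
# from any of the three mentoring studies.
p_value = 0.03        # hypothetical p-value for one outcome in one study
n_outcomes = 19       # number of outcome areas examined in the review

criteria = {
    "conventional threshold (wooden bridge), alpha = .05": 0.05,
    "Bonferroni-adjusted for 19 outcomes (steel girder)": 0.05 / n_outcomes,
}

# The same result clears the generous criterion but not the stringent one,
# which is how identical data can yield different headline conclusions.
for label, alpha in criteria.items():
    verdict = "significant" if p_value < alpha else "not significant"
    print(f"{label:55s} -> {verdict} (threshold = {alpha:.4f})")
```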

Using the most stringent criterion – that steel-girder connection – SMP mentoring programs demonstrate no significant effect on student achievement or on personal outcomes such as reduced absenteeism or mentees’ thinking about their future goals. BBBSA mentees, on the other hand, produce higher-quality classwork and complete more class assignments than students without a mentor.

Using the more generous statistical criterion – the wooden bridge connection – SMP mentees benefit significantly in five outcome areas, including reduced truancy. The same criterion yields significant benefits in four outcome areas for CIS mentees and 10 outcome areas for BBBSA participants.

In the formal meta-analysis, in which the results from all three studies were combined and analyzed together, mentoring showed evidence of a positive effect in six of 19 outcome areas. Specifically, mentoring relationships help reduce truancy, provide a supportive non-family adult relationship, increase confidence in academic abilities and levels of peer support, and lower school-related misconduct and absenteeism.

The presence of a mentor did not produce specific academic achievement gains in math or reading, nor did mentees differ significantly in their use of illegal substances.

In other words, when the studies are examined collectively through meta-analysis, school-based mentoring does not directly affect academic achievement. Its strongest effects, however, on truancy reduction and other pro-social behaviors, can plausibly be linked to improved academic performance over time.

At a time when budgets continue to be cut, the authors stress the importance of making good, research-based policy decisions with an eye on demographics and program differences. Not all school-based mentoring programs are alike. Programs serving middle-school students of a variety of racial backgrounds in 100 cities across the country cannot necessarily be compared with programs serving predominantly Hispanic high school students in south Texas. Nor can fair comparisons be made between programs that mainly utilize high school versus adult mentors.

When organizations spend $1,000 per student to facilitate school-based mentoring, advocates and policymakers understandably want to draw conclusions about cost-effectiveness from research studies. In addition to urging awareness of program delivery differences, the authors call for a longitudinal study to examine the effects of multi-year mentoring. Modest effects should not be surprising when the average length of mentoring is less than six months, as it was in each of the original studies.

Although the modest findings of these studies became the rationale for suspending funding for school-based mentoring, a more appropriate response would be to treat the results as preliminary support for emerging best practices.

 

Name of the study: Review of Three Recent Randomized Trials of School-Based Mentoring: Making Sense of Mixed Findings.
Authors: Marc E. Wheeler, Thomas E. Keller, and David L. DuBois.
Study Focus: Examining three recent large-scale studies on the effectiveness of school-based mentoring, the authors analyze the organizational differences among the nationwide mentoring programs and the statistical methods used by each study to understand why some mentoring programs are found to be effective and others are not.
Type of Evaluation: Using meta-analysis, a statistical method that combines results of different research studies, the authors evaluate which outcomes school-based mentoring affects.
Scope of Program: The three original research studies examine mentoring programs in 193 schools, serving 4,024 students across the country.
Time Period: The original studies were published between 2007 and 2009.
Funded By: Partial support provided by the William T. Grant Foundation Distinguished Fellows Program.
Availability: Download a 26-page PDF of this report or view it online at http://www.srcd.org/index.php?option=com_content&task=view&id=232&Itemid=1

Alessa Giampaolo is an educational consultant and curriculum developer. She can be reached at info@youthtoday.org.
