The Study that Ignited (or Diluted) Mentoring


You’d think an organization that was serving tens of thousands of kids and had become a household name wouldn’t feel “vulnerable.” But that’s how Tom McKenna felt about Big Brothers Big Sisters of America when he headed it in the late 1980s.

The concept of youth mentoring was spreading, and while that might seem like a good thing for the nation’s pre-eminent mentoring organization, it posed a problem: With BBBSA costing about $1,000 a year per youth, policymakers and funders were increasingly demanding evidence of its impact. Meanwhile, more and more groups were doing their own form of mentoring at less cost.

“The thinking was that it [mentoring] could be done cheaper,” McKenna says. “We were in a vulnerable position.”

So McKenna did something that his successor, Judy Vrendenburg, calls “courageous” and researchers call unprecedented: He asked a research firm to conduct a thorough, control-group study to see if his youth program had any measurable impact on kids.

The report by Public/Private Ventures (P/PV) became what might be the most influential, most cited and most misused study in youth work. “Making a Difference: An Impact Study of Big Brothers Big Sisters” catapulted BBBSA to new heights and is cited as a key factor in the explosion of mentoring throughout the United States over the past decade.

But mentoring researchers lament that the study has been used in ways that actually harm youth work, fueling a rush to expand mentoring that produced a flood of cheap copycats whose impact is unclear. In the popular culture of the youth work field, the study has morphed into a generic cornerstone to support anything called “mentoring,” with countless organizations citing it as proof that mentoring works.

Such misapplication is so rampant that when P/PV reissued the study in 2000, the foreword by its president, Gary Walker, said one reason was “to remind all of us that this study did not show that mentoring, as a generic idea, is effective.”

That didn’t stop Grand Area Mentoring in Moab, Utah, from putting out a brochure that uses the study’s findings to say, “Young people who are mentored are 27 percent less likely to begin drinking alcohol, 52 percent less likely to skip school, 33 percent less likely to hit someone” – just the kind of generalization that the study doesn’t make.

“What’s sort of frustrating is that people have painted us into a corner,” says Jean Baldwin Grossman, who wrote the report with Joseph P. Tierney. “We were trying to be cautious. We said this isn’t the silver bullet.”

Which is why it’s worth reviewing what the study says and how it has affected youth work.

‘We had to do this’

If you think it’s difficult to find scientific evidence of the impact of youth work now, consider yourself lucky it’s not still the 1980s. There were very few studies about the impact of human services, says Walker of P/PV, and those that existed were either not thorough or not flattering – prompting some to question the value of funding them.

“If you really looked honestly at the … results of good evaluations of social programs,” he says, “you would walk away saying these things may be the right thing to do for moral reasons, but in fact they don’t really work.”

Some studies, however, indicated that a key to effective youth work might lie not in the activities per se, but in good relationships with adults. Walker says P/PV wanted to test the theory.

BBBSA, meanwhile, was in danger of becoming the Rolex of mentoring, as countless organizations were instituting various forms of mentoring that were less intense in recruitment, training and support, and less expensive.

“People thought mentoring was tutoring and all kind of other things,” McKenna recalls. “They looked at Big Brothers Big Sisters and thought, ‘Boy, your model’s expensive. … Why not just match kids up?’ There wasn’t a lot of respect for the care we took” in creating and maintaining good relationships between youths and adults.

Like a lot of youth-serving organizations, BBBSA was certain that its approach worked, but had no research to back it up. “I felt we had to do this,” McKenna says of the study.

So BBBSA opened its doors to P/PV for an unprecedented examination. The project was funded by Pew Charitable Trusts, the Lilly Endowment, The Commonwealth Fund and an anonymous donor.

“Nobody knew just what would come out” of the study, says Joyce Corlett, who, as director of training, worked extensively with P/PV and the agencies during the project. “Anecdotally, everybody felt secure. But there have been plenty of studies that were bland or negative” when organizations expected to see positive results.

The Findings

P/PV conducted four studies; the first three focused on processes, such as uniformity of the model among the affiliates, volunteer recruitment and the factors that made the “big/little” matches successful.

The initial studies prompted BBBSA to correct a few procedural problems and make sure the affiliates (now numbering 427) were uniformly following practices that were found to work well. For example, it appeared that BBBSA’s tough screening process for adult volunteers had gone too far. “Some of our case workers had the idea that they wanted to make the applicants jump through as many hoops as possible,” McKenna says.

One result: Applicants often waited so long for a match that they bowed out. “We were losing people,” Corlett says.

The agency adopted a more uniform approach that was careful, but faster and less cumbersome for the volunteers, McKenna says.

Then came the fourth study, about the impact of BBBSA on youth.

For that, P/PV tracked 959 youths, ages 10 to 16, during 1992 and 1993 at BBBSA affiliates in Phoenix, Ariz.; Wichita, Kan.; Minneapolis, Minn.; Rochester, N.Y.; Columbus, Ohio; Houston and San Antonio, Texas; and Philadelphia. About half the youths were in a treatment group and were matched with “bigs,” while the rest were in a control group and assigned to BBBSA waiting lists.

Researchers compared the two groups after 18 months, using reports from the youths, their parents and BBBSA case managers.

The findings, published in 1995, gave BBBSA what youth agencies everywhere want: affirmation that it helps kids.

The major findings – and the ones most often repeated by other organizations – were that, compared with the control group, youths who got BBBSA matches were 46 percent less likely to begin using illegal drugs, 27 percent less likely to begin using alcohol and 32 percent less likely to have hit someone in the past year, and skipped 52 percent fewer days of school.

As these findings have been spread about, a few details about the study have been lost:

  • Most of the impacts were relatively “small,” or were based on small sample sizes that could easily produce big percentage changes, Grossman notes.
  • The study found no statistically significant impact for many behaviors, including stealing from stores, cheating on tests, smoking and participating in recreation programs.
  • Within the categories measured, there were often significant differences between races and genders.
  • As a group, the youths with BBBSA matches showed increased negative behaviors, such as drinking and drug use, although their increase was less than that of the control group.

Nevertheless, Grossman notes that with the youth field in need of some evidence that its programs work, “people went head over heels.”

Inspiring Growth

McKenna says the impact study was “a big turning point for us” at BBBSA. “It was a terrific boost to the bigs, to the volunteers,” as well as to the professional staff.

The validation of BBBSA’s approach infused the organization with ambition to grow. The phrase “going from success to significance” became “almost a mantra,” Corlett says. “We began saying that if it’s having these robust effects … what can we do to double our size?”

Over the past decade, the national office and local programs have used the study to recruit volunteers and raise money. References to the study are sprinkled throughout literature and websites produced by BBBS organizations in the United States and beyond.

“I have often used the statistics when applying for funding, and when presenting for volunteer recruitment purposes,” Lorraine Levitt, executive director of Big Brothers Big Sisters of Williams Lake, in Canada, said via e-mail.

Those promotional efforts got a big boost when the study landed the organization on what has become one of the most respected “best practices” lists in the country – the “Blueprints” list of evidence-based programs, created in 1996 by the Center for the Study and Prevention of Violence at the University of Colorado. BBBSA is the only national youth-serving organization among the 11 programs that made the list of “models.”

The study and the ambition it ignited are major reasons BBBSA grew from 75,000 matches at the time of the study to more than 230,000 now. It has set a goal of 1 million by 2010.

As for funding, BBBSA says that the national office and its affiliates took in $89.5 million in revenue in 1995. Last year, they took in $252.4 million. (The totals include reports from most, but not all, of the affiliates.)

The federal government, which reduced funding for social programs in recent years, has increased its commitment to BBBSA. The Adam Walsh Child Protection and Safety Act, signed into law in July, authorizes $58.5 million for the organization over five years.

But the impacts went beyond BBBSA. Grossman believes the study, and the subsequent reaction, “prevented the mentoring movement from dying.”

An Impact Study’s Impact

Mentoring researcher Jean E. Rhodes of the University of Massachusetts at Boston calls the study “the watershed report” in the growth of mentoring over the past decade. As a result of the report, she told a forum in Washington in September, “mentoring won the hearts and minds” of the public and policymakers.

“The findings provided scientific justification for policymakers and practitioners from across the political spectrum to promote mentoring and, more than a decade later, continue to undergird the new generation of programs,” Rhodes and researcher David L. DuBois wrote in a recent issue of Social Policy Report.

“Findings were cited on the floor of the U.S. Senate, and in research, news, and opinion pieces.”

Walker of P/PV says that at a time when it was impossible to find scientific evidence of the impact of most social work, “this study stood out in that it did have positive things to say … about the effects of intervention on youth.”

Even better, it was a type of intervention that appealed to the political center. “This was about relationships,” he says. “It wasn’t about putting together huge comprehensive services that make a lot of people nervous. It was an easy sell.”

And it reinforced what people already believed was common sense – that youths benefit from positive relationships with adults.

The Justice Department used the study to modify its Juvenile Mentoring Program (JUMP). “It influenced the design in that it identified some of the key components that you need to put into mentoring,” said Doug Dodge, former director of the Discretionary Programs Division of the Office of Juvenile Justice and Delinquency Prevention (OJJDP).

For example, an OJJDP Juvenile Justice Bulletin in 1997 said the study prompted the agency to expand its requirements for mentor training and support.

The study also helped to make mentoring a key part of the Presidents’ Summit in 1997, from which America’s Promise – The Alliance for Youth was born.

When the National Mentoring Center opened in 1998, it worked with BBBSA and P/PV “to translate the findings of that study ... into useable program development materials” for mentoring programs, Michael Garringer, resource adviser at the center, says via e-mail. With the center having distributed more than 250,000 copies of the materials, Garringer says, “I cannot imagine anything having more influence over the mentoring field today than the original study and the many useful technical assistance materials” that the center and others produced as a result.

Further illustrating the study’s influence, Rhodes and DuBois write that their Internet searches for the study’s title turned up “about 70,000 hits.” Many of those citations, however, also show that the BBBSA and researchers couldn’t control how the study was used.

Does it Say That?

Researchers understand that people will use evaluation data for their own purposes. Grossman says that’s why she and Tierney, her co-author, “tried very carefully not to use inflammatory language” in writing their report.

Nevertheless, BBBSA and the researchers quickly saw the study used to make claims the report doesn’t support. “People who didn’t follow our model … were pointing to the research like it justified what they were doing,” McKenna says.

Consider Grand Area Mentoring in Utah, which bills itself as an “academic mentoring” program, is run under the local school district and operates in the schools. The program’s website introduces findings from “Making a Difference” by suggesting that such benefits can accrue from “just one hour per week” of mentoring.

BBBSA stresses that it is not academically focused. The study did not include school-based mentoring, and it says the bigs averaged nine to 12 hours a month with each youth.

The director of the Utah program said he didn’t read the study, but got the summary he used from an education website.

Also common is this approach by St. Vincent Health, a health care system in Indiana: “Mentoring Works!” declares a section of its website that recruits mentors. As proof, it applies figures from the BBBSA study to mentoring in general. Similar generalizations are made by the Corporation for National and Community Service, the National Center for School Engagement and the Ad Council.

“I have seen lots and lots of people say we proved mentoring works,” Grossman says. “We proved mentoring done in a certain way works.”

She knows that’s par for the course in the way research data are used. “The one thing that sort of bothered me,” she says, “is when the results were used to support not one-on-one mentoring, not face-to-face mentoring,” but such cousins as e-mentoring, group mentoring and school-based mentoring. Those approaches might be helpful, she says, but she hasn’t produced any evaluations that say so.

To be sure, human services agencies routinely use studies of other programs in order to establish that their work is based on practices that have been proven effective. After all, not many agencies can afford random-assignment, control-group studies. And no one from BBBSA or P/PV knocks programs that can’t afford to do the kind of mentoring that BBBSA does.

In fact, they and the National Mentoring Center have helped countless agencies use the findings to adopt the effective practices.

But Rhodes and DuBois contend that the report inadvertently added to a rush to grow mentoring on such a scale that it can’t be widely sustained in the form that was proven effective.

“Modest findings from the evaluation of an intensive community-based approach to mentoring helped to galvanize a movement and stimulate aggressive growth goals,” they write in Social Policy Report. “These goals necessitated that mentoring be delivered more efficiently, which, in turn, changed the intervention to something that bears decreasing resemblance to its inspiration.”

When P/PV reissued the study in 2000, Walker’s foreword noted, “Neither warm-hearted volunteers nor well-intended professionals in schools can make it [mentoring] uniformly effective without tending to the lessons that Big Brothers Big Sisters has learned.”

Those lessons, according to the study, include extensive volunteer screening, training and ongoing support, and frequent face-to-face time between bigs and littles.

Vrendenburg says organizations should note the importance of “making the connection correctly” between a youth and mentor, and of providing consistent professional support for “the development of a relationship between these absolute strangers that come from these very different worlds.”

A lesson from the early P/PV reports, McKenna says, is for the mentors to focus on youth development, as opposed to going into the relationship trying to achieve certain outcomes, such as specific academic improvements.

The good news, they stress, is that mentoring, when done well, produces measurable results. The tougher news is that, as Walker wrote in his foreword, “mentoring’s easy attractiveness belies the effort and structure that makes it work.”


Big Brothers Big Sisters of America
(215) 567-7000,

Jean Baldwin Grossman
Public/Private Ventures
(215) 557-4400,

The study is at