Denver, Colo. — When Robert Barnoski saw the first outcome data from his evaluation of Functional Family Therapy in early 2002, he says, “I nearly fell out of my chair.”
As a senior researcher at the Washington State Institute for Public Policy, Barnoski had helped to create a 1997 state law offering funds for local courts to replicate Functional Family Therapy (FFT) and other “evidence-based” approaches to combating teen delinquency. He then tracked the outcomes, comparing FFT participants in 14 courts with youth randomly assigned to the courts’ usual services.
At first glance, the results looked disappointing. Teens in FFT had been convicted of new felony offenses at almost the same rate as those in the control group.
But as he analyzed the outcomes more carefully, Barnoski grew elated. “I’ve never seen data line up so perfectly in a real-world trial,” he says.
Barnoski’s team had measured “therapist adherence” – how well the therapists trained to deliver FFT followed the model’s regimented treatment approach for chronically delinquent teens. The correlations were striking: When FFT was delivered by therapists rated by the national FFT office as “competent” or “highly competent,” the program reduced felony recidivism rates by 30 percent and saved taxpayers about $7.50 for every dollar of program cost.
The problem was, 17 of the 36 therapists were deemed “borderline” or “not competent.” Youth assigned to those therapists performed even worse than youth receiving the usual services.
The story is emblematic. With the growth of “evidence-based programming” in youth development, approaches such as FFT, Multisystemic Therapy (MST) and Multidimensional Treatment Foster Care have been replicated in hundreds of mental health, juvenile justice and child welfare agencies nationwide. Those three models serve nearly 40,000 youth per year.
But while these and other approaches have substantially reduced delinquency, substance abuse and other problem behaviors in randomized controlled trials, the struggle continues to answer this question: Can strategies that yield impressive results in the rarefied conditions of scientific trials sustain their power when operated by run-of-the-mill professionals beyond the control of the model developers? (See Evaluation Spotlight.)
Stung by studies like Barnoski’s, showing that evidence-based programs lose their effectiveness when they’re not implemented properly, program developers are devising elaborate protocols to guide the replication of their models. They are also increasing training and technical support and researching ways to help providers fully implement those models.
“If we’re going to be committed to empirically based practice,” says Sonja Schoenwald, a leading MST researcher, “we should also be committed to an empirically based process for how to take evidence-based programs to scale.”
Even before Barnoski’s evaluation, FFT researchers were developing an elaborate Web-based system to monitor therapists’ adherence to the model. MST, meanwhile, requires every MST therapy team in the nation to join a weekly conference call with a senior expert based either in MST’s national office or at a certified satellite office.
Such efforts are necessary, says Tom Sexton of Indiana University, a leading scholar of FFT, for a simple reason: “A good program badly delivered is just a bad program.”
To be sure, some approaches touted as models have been difficult to replicate or lacked sufficient evidence. But for these, which are among the best-tested models in the nation, evidence is mounting that strict fidelity to the model makes a crucial difference.
A Revolution in Prevention Science
Until the 1980s, prevention scholars could not point to a single delinquency prevention or intervention program model with solid scientific evidence of effectiveness. Since then, dozens of randomized, controlled experiments have returned positive results in combating youth delinquency and substance abuse.
“We’ve had a breakthrough,” says Delbert Elliott, director of the Center for the Study and Prevention of Violence at the University of Colorado.
For example, MST, which provides four to six months of intensive family-based treatment, has produced statistically significant benefits in 14 clinical trials since 1986 for youth with serious delinquency, substance abuse and mental health problems. In those trials, it reduced arrests by 25 to 70 percent and future placement in juvenile corrections or mental health institutions by 47 to 64 percent.
Multidimensional Treatment Foster Care (MTFC), which temporarily places troubled youth with specially trained foster families while counseling their parents, has also lowered juvenile recidivism dramatically in clinical trials. In one trial, repeat juvenile offenders placed in MTFC were seven times more likely to avoid future arrests than youth placed in group homes, and they spent less than half as much time incarcerated.
FFT, which employs cutting-edge techniques in a more conventional counseling format, has produced similarly impressive results in a series of trials dating back to 1973.
Likewise, several school-based prevention models have reduced future drug abuse and delinquency in clinical studies, while some early intervention programs – like visits by nurses to high-risk mothers and behavioral interventions for preschoolers – have been shown to reduce problem behavior during adolescence and beyond.
These successes generated enthusiasm in academic circles, but initially made a far smaller splash among policymakers and youth-serving professionals.
Practitioners often resist “canned,” one-size-fits-all approaches, says Elliott. “There’s a healthy skepticism. … [Practitioners] think they know their kids better than some researchers.”
Besides, few therapists or youth workers are accustomed to following rigid protocols or documenting their work scrupulously, as evidence-based programs commonly require.
To help overcome this resistance, in 1996 Elliott cobbled together funding from the U.S. Centers for Disease Control and Prevention and from the states of Colorado and Pennsylvania to launch Blueprints for Violence Prevention. He assembled an expert advisory board and began reviewing the scientific evidence behind scores of models to identify those proven to reduce delinquency and substance abuse.
Elliott set the bar high: Only replicable programs repeatedly demonstrating success in random trials were to be labeled “blueprint” models. “My intent wasn’t that programs that don’t meet this standard shouldn’t be funded,” Elliott says. “I do think before you spend $100 million on replicating a program, you really ought to know it works.”
In 1998, Blueprints published detailed reports on the first 10 models to meet its standards, including FFT, MST and MTFC. Since then, the Blueprints reports have generated enormous attention for the selected programs and for evidence-based programming generally. “People can trust it,” says FFT’s Sexton.
Federal agencies have jumped in with their own lists of evidence-based programs, including the Department of Health and Human Services, the Department of Education, the Office of Juvenile Justice and Delinquency Prevention, the Substance Abuse and Mental Health Services Administration, the National Institute on Drug Abuse and the U.S. Surgeon General.
All of this attention has gradually created widespread interest in evidence-based programs and jump-started substantial replication efforts. Several states have set aside funds to support evidence-based anti-delinquency programs. And interest among professionals and policymakers continues to grow. In March, a national Blueprints conference in Denver attracted over 1,000 participants – far more than organizers anticipated, even though they cut off registration two weeks early.
The Science of Replication
For the research community, the push to replicate evidence-based programs has brought both vindication and a vexing challenge. Until recently, prevention scientists were novices at helping front-line government and nonprofit service providers implement their complex programs effectively in less controlled circumstances.
Sexton of FFT recalls a call from Elliott in the early 1990s: “He asked me whether we were ready to take the FFT model to scale. I had no idea what he was talking about.”
Since then, Sexton and FFT’s original developer, James Alexander, have created a nonprofit organization, also called FFT, to provide training and to work with local agencies on replicating the model. To promote fidelity to the treatment model, FFT requires therapists and supervisors in every program to promptly enter progress notes and other case information into a Web-based monitoring system, and to survey client families about their experiences with FFT.
Local programs receive quarterly reports from the central FFT program office, and therapists can go online at any time to see “weekly adherence” and “global adherence” ratings. “Unless it’s really in the hands of the therapists, it’s not going to improve practice,” Sexton says.
MST, through its own replication subsidiary, has gone to even greater lengths to ensure treatment fidelity in replication sites. Quality assurance procedures include a five-day pre-service training for every therapist, quarterly booster trainings, weekly meetings (at least) with an on-site supervisor trained in MST’s supervisory protocol, a weekly conference call for the entire treatment team with a national MST consultant, and periodic feedback from clients through a Web-based survey feedback system.
In addition, MST scholar Schoenwald has launched an ambitious research agenda to determine the critical factors for effective replication. In an ongoing 45-site replication study involving 452 therapists and almost 2,000 youth, Schoenwald has found that therapists’ adherence to the MST model has a powerful effect on youths’ future arrest rates. An increase in therapist adherence of one unit (10 points on a 100-point scale) predicted a 38 percent reduction in future arrest rates.
The most important factor in adherence to the model, Schoenwald found, is the effectiveness of the on-site supervisors and the national consultants who oversee their work. One remarkable finding: A one-unit increase in a supervisor’s adherence to MST’s structure and process predicts a 53 percent reduction in the likelihood of criminal charges being filed against a youth after treatment.
Trying to Get it Right
“We’ve made tremendous progress,” Elliott says. “I think that today a lot of people use ‘evidence-based practice’ as part of their everyday language, and there is increasing recognition that ensuring high-quality implementation is a critical challenge.”
Throughout the two-day Blueprints conference, the terms “replication,” “fidelity” and “adherence” were continually on the lips of both speakers and attendees.
Out in the field, replicating evidence-based programs remains a daunting challenge. Often, funding is scarce, support from model developers is thin, and the extra demands on administrators and staff are considerable. (See “Fidelity vs. Reality.”)
Nevertheless, evidence is mounting that when proper support is provided, research-based programs can be replicated with high fidelity in mainstream youth development agencies. In a 42-site national replication project involving eight of the Blueprints violence prevention models, three-fourths of the sites implemented all of the core components of their chosen program models.
All but one site achieved at least 81 percent adherence – a far better result than previous replication studies. (See “Lessons in Replication.”)
Back in Washington, where competent delivery of evidence-based models has become a focal point, Barnoski says, “We’re confident that the overall quality and competence of our programs is going up.”
Dick Mendel is a freelance writer based in Baltimore. Contact: RAMendel@aol.com.
Fidelity vs. Reality
Evidence-based programs involve complicated and often unconventional methods and require cooperation from a variety of unfamiliar partners, so implementing them correctly can be difficult. Consider these tales:
Getting help: In 2004, the Peter and Elizabeth C. Tower Foundation (based outside Buffalo, N.Y.) began offering grants for agencies to replicate evidence-based substance abuse prevention and mental health programs for children. But when the agencies tried to contact developers of the model programs, “it was quite frustrating,” recalls foundation Program Officer Michael Kustreba.
“We couldn’t access the program developers or their staffs, and once we did finally get hold of them, we had one heck of a time getting consistent and reliable information” about when training would be provided, how much it would cost and what ongoing implementation support would be offered, he says.
Children’s Friend, a mental health agency in Massachusetts, used a Tower Foundation grant to replicate Brief Strategic Family Therapy (BSFT), a substance abuse treatment approach developed in Miami. After struggling to reach the BSFT leaders, Children’s Friend arranged training for its therapists and contracted for six months of telephone supervision, including reviews of videotaped therapy sessions, says Diane Young, director of the agency’s office in Salem, Mass. But as of March, with the six months nearly elapsed, two therapists still weren’t certified and no one on staff had been trained to review videotapes.
“Sustaining our model will depend on training to be certified at the supervisor level,” Young says.
Qualified Therapists: Halvorson House, in rural Farmington, N.M., has operated a small MST program since January 2005. The state’s Medicaid program reimburses Halvorson for its MST work. However, state rules say that for the agency to be reimbursed, most of its MST therapists must hold master’s degrees, while the MST treatment manual requires only a bachelor’s degree.
“It’s really hard to find master’s-level therapists around here,” says program supervisor Kristen Hale.
Coordinating Partners: With funding from the Florida Department of Juvenile Justice, the Henry and Rilla White Foundation operates MST programs in three Florida counties. The programs are struggling in two of the counties – Duval (Jacksonville) and Escambia (Pensacola) – because local judges and probation officers frequently pull youth out of MST for minor violations such as occasional truancy or a single dirty drug screen, even when their overall behavior is improving. “They need to allow the MST program to do what it does,” says Patsy Schossler, who oversees the White Foundation’s MST programs.
MST’s results are far better in the third county, Alachua (Gainesville), where judges don’t review cases during the course of MST treatment and probation officers work more cooperatively with MST staff. The dropout rate for Alachua County’s MST program has been just 12 percent this year (vs. 27 percent and 24 percent in Escambia and Duval, respectively), Schossler reports. In Alachua, 71 percent of youth who’ve completed the program have avoided re-arrest, compared with 63 percent in Escambia County and 50 percent in Duval County.
Staff Motivation: “If you’ve got therapists who want to work in a more traditional, autonomous mode” – rather than following the highly structured and intensively supervised MST approach – “they’re not going to last very long in MST,” says Keller Strother, president of MST Services, the replication arm of MST. “We’ve had numerous occasions where we’ve had complete staff turnover within six months of starting a new MST program.”
Doug Kopp, director of FFT’s national replication office, recalls that on several occasions when FFT staff visited a new program site to deliver training, “the therapists wouldn’t know what they were there for.” Now FFT visits potential replication sites before agreeing to provide training and other support. “Sites need to be committed to the mission of evidence-based family therapy,” he says.
Money: Delivering evidence-based programs is difficult when funders don’t reimburse providers for all the costs. For instance, MST therapists don’t just deliver counseling sessions. They also drive to and from clients’ homes, interact with teachers and probation officers, and spend hours each week documenting their efforts and participating in case reviews and supervisory sessions. These costs often are not reimbursable.
“If you want people to do programs that work, you need to pay for them to provide the services,” says Kustreba of the Tower Foundation. “You can’t pay for the 50-minute hour when it takes 90 minutes.”
Some states have created reimbursement codes to fully fund practices such as FFT and MST through Medicaid or from mental health and children’s services budgets. But in much of the country, agencies still face the “pretzel problem” when they attempt to deliver evidence-based models, says Karen Blase, a replication expert at the National Implementation Research Network in Tampa, Fla.: “You have to twist your programs around all of these different funding streams.”
Lessons in Replication
“I don’t care how good the model is,” says Delbert Elliott, director of the Center for the Study and Prevention of Violence (CSPV). “If you don’t implement it well, you’re not going to get the effects.”
From 1999 to 2002, CSPV’s Blueprints for Violence Prevention project conducted a 42-site experiment to replicate eight evidence-based violence prevention models. While previous research showed that model programs are typically not implemented completely or well, sites in this study achieved very high rates of program adherence – at least 80 percent in virtually every case. Although no outcome evaluation was conducted, the high adherence rates should help the replication programs approach the strong impacts these models achieved in clinical trials.
In a process evaluation of the replication initiative, CSPV scholar Sharon Mihalic and her colleagues identified these factors as critical to effective replication:
Dissemination Capacity: Program developers must support dissemination with manuals and other published materials, training, ongoing technical assistance, and data management systems to measure adherence to the program model.
Site Selection: The biggest reasons for failure were that sites lacked a strong champion for the replication effort, motivated and qualified staff, administrative support, organizational capacity and stability, and/or credibility with funders and key partners in their communities.
Training: Treatment and administrative staff must be hired in advance, attend intensive pre-service training with the model developer, and implement the program soon after training is complete.
Ongoing Technical Assistance: To maintain fidelity, replication sites need access to the model developer for follow-up training, advice in addressing unexpected problems, and general oversight and assistance.
Attention to Fidelity: With Blueprints staff maintaining close contact with replication sites and continually emphasizing the importance of fidelity during site visits and telephone consultations, most replication sites fully implemented the models.
Resources
Delbert Elliott, Director, Center for the Study and Prevention of Violence, University of Colorado
Doug Kopp, President, FFT
Keller Strother, President, MST Services