It’s a common and much-complained-about paradox: How can nonprofit youth-serving agencies pay for the rigorous evaluations that funders require, when no one will provide enough money to carry them out?
Psssst: A nonprofit in Los Angeles has figured out how to fund high-quality evaluations of its program while spending little of its own money.
The Phoenix Academy of Los Angeles is a 120-bed residential substance abuse treatment program for 13- to 17-year-olds, most of whom are placed by the Los Angeles juvenile courts. The academy is one of 11 operated by Phoenix House, one of the nation’s largest substance abuse treatment providers.
In lieu of hiring evaluators outright, Phoenix Academy has focused on parlaying its reputation, expertise and access to raw data into lucrative, symbiotic partnerships with research organizations like RAND Corp. that help the academy identify and secure evaluation funding.
Those partnerships have yielded mutually beneficial gold-standard evaluations, boosted Phoenix Academy’s quality improvement plan and garnered good publicity – delivering a lot of evaluation bang for relatively few of the academy’s bucks.
Getting in the Spotlight
It is a sign of this strategy’s success that although community-based substance abuse treatment for youth has been the focus of little scientific evaluation, Phoenix Academy crops up repeatedly in a handful of well-regarded research studies of adolescent substance abuse treatment. Those include the benchmark Adolescent Treatment Model (ATM) study of effective programs from the U.S. Substance Abuse and Mental Health Services Administration (SAMHSA), RAND’s Adolescent Outcomes Program (AOP) study, and Drug Strategies’ survey of highly regarded treatment programs.
Such studies, and their conclusions about Phoenix Academy, have been touted in press releases, posted on websites and published in scientific journals. For instance, in an AOP study published in the September 2004 issue of Psychology of Addictive Behaviors, RAND researchers concluded that “Phoenix Academy treatment is associated with superior substance use and psychological functioning outcomes” when compared with alternative juvenile probation dispositions.
In May, that finding led the U.S. Office of Juvenile Justice and Delinquency Prevention to designate the academy a model program. Phoenix Academy will appear in the agency’s online Model Program Guide and Database.
You can’t buy that kind of publicity.
“The Phoenix Academy project [the AOP study] cost well over $1 million,” says Andrew Morral, director of RAND’s Safety and Justice program and an investigator on that and several other adolescent substance abuse treatment studies. “No program that I know of could afford to pay that.” The study was funded by a grant from SAMHSA.
The academy also relies on its familiarity and reputation.
The Phoenix Academy has operated in California since 1979, and its name came up repeatedly in 2002 when the nonprofit research institute Drug Strategies asked an expert panel, national organizations and state alcohol and drug abuse agency directors to name the five best treatment programs in their states. The programs were then examined by Drug Strategies. The results were reported in the September 2004 issue of Archives of Pediatrics & Adolescent Medicine and published in a printed guidebook and on a website.
In interviews with Morral and Phoenix Academy Director Elizabeth Stanley-Salazar, several key themes emerged as guidelines for nonprofits seeking low-cost ways to fund evaluations through research partnerships:
• Work with other organizations to increase your viability and visibility, even though it may not produce immediate results.
Agencies “should be working through collaborations, coalitions and associations to try to do some of this work,” Stanley-Salazar says. “Maybe do some peer review, some peer evaluations down on the ground.”
Working together in coalitions makes small programs more appealing, especially in terms of their ability to provide researchers with diverse, representative sample sizes. “When you get into research, sample size dictates a lot,” Stanley-Salazar says.
She also says Phoenix House works to stay visible in the research community. The organization belongs to the National Institute on Drug Abuse (NIDA) clinical trials network, and its research department has met with the institute, the U.S. Center for Substance Abuse Treatment and the U.S. Department of Justice “to advise on what types of studies are needed, where in the field we could advance our understanding. But we don’t initiate the study,” Stanley-Salazar says.
• Sell your story to researchers and funders.
“It sounds very crass to say this, but to get funding, there has to be an interesting story to tell about the program,” Morral says. What’s hard to sell, he says, is “something that looks unique and irreproducible, or that’s connected to the personal beliefs of the counselors themselves or the director, and where there’s no way of knowing if the service offered this week is the same as the service offered next week.”
Agencies must pitch a proposal that “persuasively suggests that … evaluating this program is going to have ramifications for other programs as well,” Morral says. RAND believed that Phoenix Academy was providing care that represented one of the standards of adolescent long-term residential treatment – the therapeutic community model – and that measuring its effectiveness could cast light on the effectiveness of similar programs.
Programs should also familiarize themselves with the focus and mission of potential research partners. “RAND’s mission is to conduct research on improving substance abuse treatment in the country – what policies can be modified and improved so that care is improved,” says Morral. “That’s what we’re getting out of it: the opportunity to look at substance abuse treatment policies and procedures.”
Additionally, Phoenix Academy provided RAND with detailed in-house data and information, so that when Morral applied for grants “it didn’t sound like I was unfamiliar with them.” Also key: “They promised access” to youth from the program.
• When you think your program is ready, initiate contact with research organizations.
The seeds for the Phoenix Academy’s partnership with RAND were planted when the academy hired RAND for a relatively small project. “The first thing RAND did for us in 1995 was assist us in putting together our client information assessment tools – our client characteristics, intakes, screening and assessments,” Stanley-Salazar says. “That whole relationship evolved into various partnerships.”
• Watch for funding streams.
Stanley-Salazar says Phoenix Academy and RAND take turns leading the effort to identify and apply for grants.
“There are opportunities that come out of the government sometimes, and that come out of foundations, that can and do pay for projects once in a while,” Morral says. For example, he says, SAMHSA pays organizations to look at the effectiveness of manual-guided adolescent substance abuse treatments in community-based settings.
• Roll up your sleeves.
Even if your program isn’t footing the financial bill for an evaluation, “there is a fair amount of work involved in talking with the researchers who are trying to get the funding, providing them input and information on what they’ve written about the program or are thinking about the program,” Morral says.
“There are difficult questions about access that have to be resolved,” he says. For instance: Can the researchers observe group sessions? Do the kids need to provide informed assent? How do researchers get parental consent?
• Use what you learn and maintain your research partnership.
Once RAND’s outcome evaluation study showed the effectiveness of Phoenix Academy’s treatment, what the program needed next was “not necessarily another study,” Stanley-Salazar says.
Stanley-Salazar says that after the completion of the RAND outcome study, Phoenix Academy moved on, instituting a continuous quality improvement program and business evaluation program with RAND’s help.
Morral says that after one high-quality outcome evaluation, most programs are able to say, “Look, we are evidence-based,” and gain a place on the expanding number of evidence-based practice lists watched by funders.