
Don’t Be Scared — It’s Just an EBP

 

Youth-serving programs are told that there are lots of good reasons to implement evidence-based practices (EBPs). But mandating that agencies adopt EBPs is a touchy matter, as the state of Ohio recently found out.

Like many states, Ohio decided a few years ago to use EBPs to raise the quality of care in its mental health system, which included agencies working with youth. But when state officials talked about adopting a state rule requiring EBPs, “we got a huge firestorm,” said Dee Roth, chief of the Office of Program Evaluation and Research (OPER) in Ohio’s Department of Mental Health.

“What we were going to have was a rule that said, ‘If you’re a small agency, pick one EBP off this huge list and do it. And if you’re a big agency, pick two and do those.’ And just that much was enough to have a firestorm in the community about who’s going to make up the list and ‘What do you mean we’re not doing evidence-based practice now?’ ”

Sensing that wielding an authoritative stick wouldn’t work, Roth said, “We finally just decided to abandon the battle and go with the carrot approach.”

It’s an instructive lesson for government agencies as they consider how to move providers toward evidence-based practices.

The Carrot

Ohio’s approach started with the creation of a training, support and research partnership that includes OPER, the private research firm Decision Support Services, researchers from Ohio State University’s Fisher College of Business, and nine centers scattered throughout the state that specialize in promulgating EBPs.

The state commissioned those Coordinating Centers of Excellence in 1999, with each one focusing on implementing at least one EBP. Roth said the centers serve as resources for the state, teach and train agencies on the use of their specialty practices, monitor fidelity to the practice model and provide technical assistance.

Once that infrastructure was in place, Roth said, stakeholders realized that the decision to entice agencies to adopt EBPs, rather than force them to, “made it more important for us to understand what the variables are” that affect successful implementation. So the partnership created the Innovation Diffusion and Adoption Research Project to find out about those variables.

The project is a two-year longitudinal study that focuses on four particular EBPs and seeks to answer two key questions: What predicts whether an agency is willing to adopt evidence-based practices? And after an agency adopts an EBP, what predicts whether the practice will stick within the organization and produce good outcomes?

The study asks those questions within the frameworks of four models that are common in the fields of organizational and industrial psychology: Adoption Decision; Multilevel Model of Implementation Success; Cross-Phase Effects on Implementation Outcomes; and Effects of Implementation Variables on Outcomes Over Time. The study’s emerging results confirm that the models can provide commonsense guidance to organizations as they consider EBPs.

Don’t worry; it’s not as nerdy as it sounds.

Model Behavior

A recently published report on the research project – “IDARP: Moving from the diffusion of research results to promoting the adoption of evidence-based innovations in the Ohio mental health system” – described the models like this:

Under the Adoption Decision model, the decision to adopt an EBP is “negatively related to the perceived risk of adopting, and positively related to (a) the organization’s capacity to manage implementation-related risks and (b) [its] historic propensity to take risks.”

The Multilevel Model “expresses the idea that factors at many different levels of analysis impact implementation success.” Those levels include: environmental variables (professional norms in the industry), interorganizational relationships (the quality of communication with peers and supporters) and organizational variables (such as culture, size and the extent to which new approaches are encouraged without fear of reprisal for failures).

The Cross-Phase Effects model says three phases of the adoption process are likely to affect the implementation: initiation (awareness of a need and a search for solutions), decision, and implementation (working out details, securing resources and training staff).

The Effects of Implementation model incorporates two messages: that “past implementation policies may or may not explain” current implementation outcomes, and that “what is likely to matter most are present implementation policies and practices.”

In order to measure the variables defined by the models, the researchers interviewed “key informants” – that is, program staff and community collaborators – from 91 behavioral healthcare organizations that had adopted, or were considering adopting, one of the EBPs that were the focus of the study.

“The mix of key informants sort of evolved with the process,” explained Phyllis Panzano, president of Decision Support Services. “The ones we approached for the decision-making model were folks identified as being actively involved in the decision. As we started tracking implementing agencies, our shift of key informants went more to team leaders or therapists. We wanted to track things … as [they] hit the ground.”

Model Results

“It turned out that the biggest factor for an agency [deciding to adopt an EBP] is their perception of the level of risk in taking it on,” Roth said.

According to an article Roth co-authored for the July 2005 issue of Foundation News, agencies that adopted EBPs concluded that the benefits of doing so outweighed the risks, that the risks were manageable and that their organizations were innovative.

The researchers concluded that giving organizations as much information as possible about EBPs – including scientific evidence – helps them decide if they are capable of managing the risk.

Researchers found a couple of other important variables.

One is performance monitoring: “whether the organization and leadership are actively engaged in watching the implementation, seeing how it’s going, seeing if it trips up, propping up as necessary,” Roth said.

Another is top management support: “if the agency leadership and the executive director are all hyped up about this at the beginning … and they’re real supportive and they’re running around caring about it,” she said.

Identifying such variables, Roth said, led her agency to the next step: figuring out how to influence those variables so that agencies become enthusiastic about adopting EBPs. Roth said she has told the nine coordinating centers, “Get out there and be able to provide the kind of evidence that will show somebody that it’s less risk than they fear.” That effort is just beginning; data collection ended only six months ago.

“We’re now shifting from data gathering … to getting some of these publications out and analyzing our data more formally,” Panzano said.

She also shared a lesson from the study’s findings that applies to all human services work. “It’s better to start out slowly with moving some of these EBPs into the field and working out the unique implementation barriers, before efforts are made to roll it out on a widespread basis,” she said.

“The experience of those early adopters – and the success of their efforts – really snowballs. That information gets out in the field. And if the information isn’t positive and it scares people off, you’ve done more harm than good.”

Contacts: Dee Roth, rothd@mh.state.oh.us; Phyllis Panzano, phyllis.panzano@dssincorporated.com. To learn more about IDARP, go to www.mh.state.oh.us/oper/research/activities.idarp.index.html.
