Walking into our monthly meeting for preventive services program management, we couldn’t miss the giant chart with numerous hand-written lines of different lengths.
We were curious about its meaning, which we finally learned during the Data and Practice Learning segment. One of our first-line managers, Banabas Awedoba, explained the chart with an African story about a man who had lent another man 200 cowries. When the lender went to request the money back, the borrower pointed to lines on his wall, each representing one of his many debts. The 200-cowrie debt was relatively small, represented by one of the shorter lines. He couldn’t pay it back, he said, laughing loudly, because he had to pay back all the larger debts first.
Banabas used this story from Chinua Achebe’s novel “Things Fall Apart” to convey the importance of understanding data, which was what the lines represented. He also used it to draw his peers into a conversation about family team conference data that he had been studying with the support of a program evaluation analyst. The prevention program he was evaluating works with families to keep children safely at home with their parents.
Program Data: Make it matter
We have program data discussions with our board and management teams regularly. Many of our managers are social workers who are experts in working with people but can feel intimidated by numbers and graphs. We found that when our “people” people were in data discussions, only two or three people — our program evaluation staff and senior leaders — ended up doing all the talking.
Wanting to engage our managers in using data to problem solve, our colleague Kristen Ragusa developed a framework that we have introduced in other programs, which we call Data and Practice Learning.
Here’s how it works:
First, managers select a theme based on their assessment of program priority areas. Within our prevention program, managers have selected topics such as domestic violence, families with infants, connecting with families’ networks and engaging teens.
Second, a program evaluation analyst meets with the manager to tap into the manager’s curiosity about what they want to learn and to decide how to structure an analysis. When Banabas led the discussion on the frequency of meetings with families, he wanted to know how well we were doing as a preventive program by looking at how many family team conferences were occurring within expected timeframes. He also wanted to know if some managers were doing better than others. Banabas was finding it hard to make the family meetings happen and was genuinely interested in improving his team’s success by learning from his peers.
Third, a program evaluation analyst or program leader works with the manager, reviewing the analysis to help them prepare to facilitate a discussion with their peers. Together they consider:
• What does the data show?
• What questions are answered and not answered?
• What hypotheses do the managers have based on the data?
Fourth, the manager leads a learning conversation with their peers during a management meeting. When Banabas presented the chart with the lines, he captured everyone’s attention. The story initiated a conversation in which Banabas explained data about how frequently family team conferences were occurring. The rate of occurrence varied among managers, and he facilitated a discussion that led to idea-sharing as managers described what they were doing. One manager dedicated one staff person to scheduling conferences. Another trained all their staff in conducting the conferences.
They also shared stories about engaging families they had a hard time bringing to meetings, which led to practical improvements. Within a year after the discussion Banabas led, we kicked off new approaches and saw a marked improvement in the number of conferences held in our preventive program, rising from 65 percent to nearly 80 percent of a random sample of families.
Data ties stories into patterns
These discussions have had several positive effects beyond helping people use data to learn and make decisions. The main impact has been to bridge the divide between “people” people and data people by helping the “people” people see that data reveals stories and allows us to tie individual situations to bigger patterns.
And it gives data people more access to the human story behind each data point, helping to ground the data in real-life circumstances. Our discussions have helped managers increase professional confidence and further develop presentation, facilitation and data-interpretation abilities that deepen their management skills.
Data and Practice Learning lets managers share knowledge and drive conversations based on empirical evidence rather than memorable one-offs that can skew interpretations of what’s prevalent. We’ve seen that as managers used data to talk about issues such as family meeting rates, the discussions shifted from being about either practice or data to an understanding that when practice and data are integrated, we develop real solutions.
***
Bonnie Kornberg is the chief performance officer and Sharmeela Mediratta is the vice president for family and community support services at Graham Windham in New York City.